00:00:00.002 Started by upstream project "autotest-per-patch" build number 127166
00:00:00.002 originally caused by:
00:00:00.003 Started by upstream project "jbp-per-patch" build number 24314
00:00:00.003 originally caused by:
00:00:00.003 Started by user sys_sgci
00:00:00.082 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.082 The recommended git tool is: git
00:00:00.082 using credential 00000000-0000-0000-0000-000000000002
00:00:00.084 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.136 Fetching changes from the remote Git repository
00:00:00.138 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.189 Using shallow fetch with depth 1
00:00:00.189 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.189 > git --version # timeout=10
00:00:00.234 > git --version # 'git version 2.39.2'
00:00:00.234 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.269 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.269 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/changes/08/24308/2 # timeout=5
00:00:05.284 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:05.296 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:05.309 Checking out Revision 698c1d41a6b0b22851122379e9a38009b28d0171 (FETCH_HEAD)
00:00:05.309 > git config core.sparsecheckout # timeout=10
00:00:05.321 > git read-tree -mu HEAD # timeout=10
00:00:05.339 > git checkout -f 698c1d41a6b0b22851122379e9a38009b28d0171 # timeout=5
00:00:05.364 Commit message: "jjb: unlink autotest job configs"
00:00:05.364 > git rev-list --no-walk bd3e126a67c072de18fcd072f7502b1f7801d6ff # timeout=10
00:00:05.478 [Pipeline] Start of Pipeline
00:00:05.493 [Pipeline] library
00:00:05.494 Loading library shm_lib@master
00:00:05.494 Library shm_lib@master is cached. Copying from home.
00:00:05.510 [Pipeline] node
00:00:05.523 Running on CYP6 in /var/jenkins/workspace/crypto-phy-autotest
00:00:05.524 [Pipeline] {
00:00:05.535 [Pipeline] catchError
00:00:05.537 [Pipeline] {
00:00:05.546 [Pipeline] wrap
00:00:05.552 [Pipeline] {
00:00:05.561 [Pipeline] stage
00:00:05.562 [Pipeline] { (Prologue)
00:00:05.755 [Pipeline] sh
00:00:06.042 + logger -p user.info -t JENKINS-CI
00:00:06.058 [Pipeline] echo
00:00:06.059 Node: CYP6
00:00:06.066 [Pipeline] sh
00:00:06.367 [Pipeline] setCustomBuildProperty
00:00:06.379 [Pipeline] echo
00:00:06.380 Cleanup processes
00:00:06.384 [Pipeline] sh
00:00:06.671 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:06.671 685789 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:06.685 [Pipeline] sh
00:00:06.970 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:06.970 ++ grep -v 'sudo pgrep'
00:00:06.970 ++ awk '{print $1}'
00:00:06.970 + sudo kill -9
00:00:06.970 + true
00:00:06.982 [Pipeline] cleanWs
00:00:06.991 [WS-CLEANUP] Deleting project workspace...
00:00:06.991 [WS-CLEANUP] Deferred wipeout is used...
00:00:06.997 [WS-CLEANUP] done
00:00:07.001 [Pipeline] setCustomBuildProperty
00:00:07.015 [Pipeline] sh
00:00:07.298 + sudo git config --global --replace-all safe.directory '*'
00:00:07.378 [Pipeline] httpRequest
00:00:07.423 [Pipeline] echo
00:00:07.425 Sorcerer 10.211.164.101 is alive
00:00:07.433 [Pipeline] httpRequest
00:00:07.439 HttpMethod: GET
00:00:07.440 URL: http://10.211.164.101/packages/jbp_698c1d41a6b0b22851122379e9a38009b28d0171.tar.gz
00:00:07.440 Sending request to url: http://10.211.164.101/packages/jbp_698c1d41a6b0b22851122379e9a38009b28d0171.tar.gz
00:00:07.443 Response Code: HTTP/1.1 200 OK
00:00:07.444 Success: Status code 200 is in the accepted range: 200,404
00:00:07.444 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_698c1d41a6b0b22851122379e9a38009b28d0171.tar.gz
00:00:09.100 [Pipeline] sh
00:00:09.392 + tar --no-same-owner -xf jbp_698c1d41a6b0b22851122379e9a38009b28d0171.tar.gz
00:00:09.410 [Pipeline] httpRequest
00:00:09.442 [Pipeline] echo
00:00:09.443 Sorcerer 10.211.164.101 is alive
00:00:09.450 [Pipeline] httpRequest
00:00:09.455 HttpMethod: GET
00:00:09.456 URL: http://10.211.164.101/packages/spdk_70425709083377aa0c23e3a0918902ddf3d34357.tar.gz
00:00:09.457 Sending request to url: http://10.211.164.101/packages/spdk_70425709083377aa0c23e3a0918902ddf3d34357.tar.gz
00:00:09.470 Response Code: HTTP/1.1 200 OK
00:00:09.471 Success: Status code 200 is in the accepted range: 200,404
00:00:09.472 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_70425709083377aa0c23e3a0918902ddf3d34357.tar.gz
00:01:07.702 [Pipeline] sh
00:01:07.992 + tar --no-same-owner -xf spdk_70425709083377aa0c23e3a0918902ddf3d34357.tar.gz
00:01:11.342 [Pipeline] sh
00:01:11.630 + git -C spdk log --oneline -n5
00:01:11.630 704257090 lib/reduce: fix the incorrect calculation method for the number of io_unit required for metadata.
00:01:11.630 fc2398dfa raid: clear base bdev configure_cb after executing
00:01:11.630 5558f3f50 raid: complete bdev_raid_create after sb is written
00:01:11.630 d005e023b raid: fix empty slot not updated in sb after resize
00:01:11.630 f41dbc235 nvme: always specify CC_CSS_NVM when CAP_CSS_IOCS is not set
00:01:11.644 [Pipeline] }
00:01:11.662 [Pipeline] // stage
00:01:11.670 [Pipeline] stage
00:01:11.672 [Pipeline] { (Prepare)
00:01:11.688 [Pipeline] writeFile
00:01:11.704 [Pipeline] sh
00:01:11.988 + logger -p user.info -t JENKINS-CI
00:01:12.003 [Pipeline] sh
00:01:12.346 + logger -p user.info -t JENKINS-CI
00:01:12.360 [Pipeline] sh
00:01:12.649 + cat autorun-spdk.conf
00:01:12.649 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:12.649 SPDK_TEST_BLOCKDEV=1
00:01:12.649 SPDK_TEST_ISAL=1
00:01:12.649 SPDK_TEST_CRYPTO=1
00:01:12.649 SPDK_TEST_REDUCE=1
00:01:12.649 SPDK_TEST_VBDEV_COMPRESS=1
00:01:12.649 SPDK_RUN_UBSAN=1
00:01:12.649 SPDK_TEST_ACCEL=1
00:01:12.657 RUN_NIGHTLY=0
00:01:12.662 [Pipeline] readFile
00:01:12.688 [Pipeline] withEnv
00:01:12.690 [Pipeline] {
00:01:12.704 [Pipeline] sh
00:01:12.995 + set -ex
00:01:12.995 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:01:12.995 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:12.995 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:12.995 ++ SPDK_TEST_BLOCKDEV=1
00:01:12.995 ++ SPDK_TEST_ISAL=1
00:01:12.995 ++ SPDK_TEST_CRYPTO=1
00:01:12.995 ++ SPDK_TEST_REDUCE=1
00:01:12.995 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:12.995 ++ SPDK_RUN_UBSAN=1
00:01:12.995 ++ SPDK_TEST_ACCEL=1
00:01:12.995 ++ RUN_NIGHTLY=0
00:01:12.995 + case $SPDK_TEST_NVMF_NICS in
00:01:12.995 + DRIVERS=
00:01:12.995 + [[ -n '' ]]
00:01:12.995 + exit 0
00:01:13.006 [Pipeline] }
00:01:13.021 [Pipeline] // withEnv
00:01:13.027 [Pipeline] }
00:01:13.045 [Pipeline] // stage
00:01:13.054 [Pipeline] catchError
00:01:13.056 [Pipeline] {
00:01:13.070 [Pipeline] timeout
00:01:13.071 Timeout set to expire in 1 hr 0 min
00:01:13.073 [Pipeline] {
00:01:13.088 [Pipeline] stage
00:01:13.090 [Pipeline] { (Tests)
00:01:13.107 [Pipeline] sh
00:01:13.393 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:01:13.393 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:01:13.393 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:01:13.393 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:01:13.393 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:13.393 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:01:13.393 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:01:13.393 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:01:13.393 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:01:13.393 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:01:13.393 + [[ crypto-phy-autotest == pkgdep-* ]]
00:01:13.393 + cd /var/jenkins/workspace/crypto-phy-autotest
00:01:13.393 + source /etc/os-release
00:01:13.393 ++ NAME='Fedora Linux'
00:01:13.393 ++ VERSION='38 (Cloud Edition)'
00:01:13.393 ++ ID=fedora
00:01:13.393 ++ VERSION_ID=38
00:01:13.393 ++ VERSION_CODENAME=
00:01:13.393 ++ PLATFORM_ID=platform:f38
00:01:13.393 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:01:13.393 ++ ANSI_COLOR='0;38;2;60;110;180'
00:01:13.393 ++ LOGO=fedora-logo-icon
00:01:13.393 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:01:13.393 ++ HOME_URL=https://fedoraproject.org/
00:01:13.393 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:01:13.393 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:01:13.393 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:01:13.393 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:01:13.393 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:01:13.393 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:01:13.393 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:01:13.393 ++ SUPPORT_END=2024-05-14
00:01:13.393 ++ VARIANT='Cloud Edition'
00:01:13.393 ++ VARIANT_ID=cloud
00:01:13.393 + uname -a
00:01:13.393 Linux spdk-CYP-06 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 00:59:40 UTC 2024 x86_64 GNU/Linux
00:01:13.393 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:01:16.698 Hugepages
00:01:16.698 node hugesize free / total
00:01:16.698 node0 1048576kB 0 / 0
00:01:16.698 node0 2048kB 0 / 0
00:01:16.698 node1 1048576kB 0 / 0
00:01:16.698 node1 2048kB 0 / 0
00:01:16.698
00:01:16.698 Type BDF Vendor Device NUMA Driver Device Block devices
00:01:16.698 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - -
00:01:16.698 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - -
00:01:16.960 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - -
00:01:16.960 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - -
00:01:16.960 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - -
00:01:16.960 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - -
00:01:16.960 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - -
00:01:16.960 I/OAT 0000:00:01.7 8086 0b00 0 ioatdma - -
00:01:16.960 NVMe 0000:65:00.0 8086 0a54 0 nvme nvme0 nvme0n1
00:01:16.960 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - -
00:01:16.960 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - -
00:01:16.960 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - -
00:01:16.960 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - -
00:01:16.960 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - -
00:01:16.960 I/OAT 0000:80:01.5 8086 0b00 1 ioatdma - -
00:01:16.960 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - -
00:01:16.960 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - -
00:01:16.960 + rm -f /tmp/spdk-ld-path
00:01:16.960 + source autorun-spdk.conf
00:01:16.960 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:01:16.960 ++ SPDK_TEST_BLOCKDEV=1
00:01:16.960 ++ SPDK_TEST_ISAL=1
00:01:16.960 ++ SPDK_TEST_CRYPTO=1
00:01:16.960 ++ SPDK_TEST_REDUCE=1
00:01:16.960 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:01:16.960 ++ SPDK_RUN_UBSAN=1
00:01:16.960 ++ SPDK_TEST_ACCEL=1
00:01:16.960 ++ RUN_NIGHTLY=0
00:01:16.960 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:01:16.960 + [[ -n '' ]]
00:01:16.960 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:16.960 + for M in /var/spdk/build-*-manifest.txt
00:01:16.960 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:01:16.960 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:16.960 + for M in /var/spdk/build-*-manifest.txt
00:01:16.960 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:01:16.960 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:01:16.960 ++ uname
00:01:16.960 + [[ Linux == \L\i\n\u\x ]]
00:01:16.960 + sudo dmesg -T
00:01:16.960 + sudo dmesg --clear
00:01:17.222 + dmesg_pid=687445
00:01:17.222 + [[ Fedora Linux == FreeBSD ]]
00:01:17.222 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:17.222 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:01:17.222 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:01:17.222 + [[ -x /usr/src/fio-static/fio ]]
00:01:17.222 + export FIO_BIN=/usr/src/fio-static/fio
00:01:17.222 + FIO_BIN=/usr/src/fio-static/fio
00:01:17.222 + sudo dmesg -Tw
00:01:17.222 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:01:17.222 + [[ ! -v VFIO_QEMU_BIN ]]
00:01:17.222 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:01:17.222 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:17.222 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:01:17.222 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:01:17.222 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:17.222 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:01:17.222 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:01:17.223 Test configuration:
00:01:17.223 SPDK_RUN_FUNCTIONAL_TEST=1
00:01:17.223 SPDK_TEST_BLOCKDEV=1
00:01:17.223 SPDK_TEST_ISAL=1
00:01:17.223 SPDK_TEST_CRYPTO=1
00:01:17.223 SPDK_TEST_REDUCE=1
00:01:17.223 SPDK_TEST_VBDEV_COMPRESS=1
00:01:17.223 SPDK_RUN_UBSAN=1
00:01:17.223 SPDK_TEST_ACCEL=1
00:01:17.223 RUN_NIGHTLY=0
13:09:57 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
13:09:57 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
13:09:57 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
13:09:57 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
13:09:57 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
13:09:57 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
13:09:57 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
13:09:57 -- paths/export.sh@5 -- $ export PATH
13:09:57 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
13:09:57 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
13:09:57 -- common/autobuild_common.sh@447 -- $ date +%s
13:09:57 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721905797.XXXXXX
13:09:57 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721905797.ZAxrOW
13:09:57 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]]
13:09:57 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']'
13:09:57 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:01:17.223 13:09:57 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:01:17.223 13:09:57 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:01:17.223 13:09:57 -- common/autobuild_common.sh@463 -- $ get_config_params
00:01:17.223 13:09:57 -- common/autotest_common.sh@398 -- $ xtrace_disable
00:01:17.223 13:09:57 -- common/autotest_common.sh@10 -- $ set +x
00:01:17.223 13:09:57 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:01:17.223 13:09:57 -- common/autobuild_common.sh@465 -- $ start_monitor_resources
00:01:17.223 13:09:57 -- pm/common@17 -- $ local monitor
00:01:17.223 13:09:57 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:17.223 13:09:57 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:17.223 13:09:57 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:17.223 13:09:57 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:01:17.223 13:09:57 -- pm/common@21 -- $ date +%s
00:01:17.223 13:09:57 -- pm/common@25 -- $ sleep 1
00:01:17.223 13:09:57 -- pm/common@21 -- $ date +%s
00:01:17.223 13:09:57 -- pm/common@21 -- $ date +%s
00:01:17.223 13:09:57 -- pm/common@21 -- $ date +%s
00:01:17.223 13:09:57 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721905797
00:01:17.223 13:09:57 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721905797
00:01:17.223 13:09:57 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721905797
00:01:17.223 13:09:57 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1721905797
00:01:17.223 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721905797_collect-vmstat.pm.log
00:01:17.223 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721905797_collect-cpu-load.pm.log
00:01:17.223 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721905797_collect-cpu-temp.pm.log
00:01:17.223 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1721905797_collect-bmc-pm.bmc.pm.log
00:01:18.166 13:09:58 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT
00:01:18.166 13:09:58 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:01:18.166 13:09:58 -- spdk/autobuild.sh@12 -- $ umask 022
00:01:18.166 13:09:58 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:01:18.166 13:09:58 -- spdk/autobuild.sh@16 -- $ date -u
00:01:18.166 Thu Jul 25 11:09:58 AM UTC 2024
00:01:18.166 13:09:58 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:01:18.427 v24.09-pre-321-g704257090
00:01:18.427 13:09:58 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:01:18.427 13:09:58 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:01:18.427 13:09:58 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:01:18.427 13:09:58 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']'
00:01:18.427 13:09:58 -- common/autotest_common.sh@1107 -- $ xtrace_disable
00:01:18.427 13:09:58 -- common/autotest_common.sh@10 -- $ set +x
00:01:18.427 ************************************
00:01:18.427 START TEST ubsan
00:01:18.427 ************************************
00:01:18.427 13:09:59 ubsan -- common/autotest_common.sh@1125 -- $ echo 'using ubsan'
00:01:18.427 using ubsan
00:01:18.427
00:01:18.427 real 0m0.001s
00:01:18.427 user 0m0.000s
00:01:18.427 sys 0m0.000s
00:01:18.427 13:09:59 ubsan -- common/autotest_common.sh@1126 -- $ xtrace_disable
00:01:18.427 13:09:59 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:01:18.427 ************************************
00:01:18.427 END TEST ubsan
00:01:18.427 ************************************
00:01:18.427 13:09:59 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:01:18.427 13:09:59 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:01:18.427 13:09:59 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:01:18.427 13:09:59 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:01:18.427 13:09:59 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:01:18.427 13:09:59 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:01:18.427 13:09:59 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:01:18.427 13:09:59 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:01:18.427 13:09:59 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:01:18.428 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:01:18.428 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:01:19.000 Using 'verbs' RDMA provider
00:01:35.291 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:47.524 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:47.524 Creating mk/config.mk...done.
00:01:47.524 Creating mk/cc.flags.mk...done.
00:01:47.524 Type 'make' to build.
00:01:47.524 13:10:28 -- spdk/autobuild.sh@69 -- $ run_test make make -j128
00:01:47.524 13:10:28 -- common/autotest_common.sh@1101 -- $ '[' 3 -le 1 ']'
00:01:47.524 13:10:28 -- common/autotest_common.sh@1107 -- $ xtrace_disable
00:01:47.524 13:10:28 -- common/autotest_common.sh@10 -- $ set +x
00:01:47.524 ************************************
00:01:47.524 START TEST make
00:01:47.524 ************************************
00:01:47.524 13:10:28 make -- common/autotest_common.sh@1125 -- $ make -j128
00:01:48.096 make[1]: Nothing to be done for 'all'.
00:02:20.197 The Meson build system
00:02:20.197 Version: 1.3.1
00:02:20.197 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:02:20.197 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:02:20.197 Build type: native build
00:02:20.198 Program cat found: YES (/usr/bin/cat)
00:02:20.198 Project name: DPDK
00:02:20.198 Project version: 24.03.0
00:02:20.198 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:02:20.198 C linker for the host machine: cc ld.bfd 2.39-16
00:02:20.198 Host machine cpu family: x86_64
00:02:20.198 Host machine cpu: x86_64
00:02:20.198 Message: ## Building in Developer Mode ##
00:02:20.198 Program pkg-config found: YES (/usr/bin/pkg-config)
00:02:20.198 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:02:20.198 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:02:20.198 Program python3 found: YES (/usr/bin/python3)
00:02:20.198 Program cat found: YES (/usr/bin/cat)
00:02:20.198 Compiler for C supports arguments -march=native: YES
00:02:20.198 Checking for size of "void *" : 8
00:02:20.198 Checking for size of "void *" : 8 (cached)
00:02:20.198 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:02:20.198 Library m found: YES
00:02:20.198 Library numa found: YES
00:02:20.198 Has header "numaif.h" : YES
00:02:20.198 Library fdt found: NO
00:02:20.198 Library execinfo found: NO
00:02:20.198 Has header "execinfo.h" : YES
00:02:20.198 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:02:20.198 Run-time dependency libarchive found: NO (tried pkgconfig)
00:02:20.198 Run-time dependency libbsd found: NO (tried pkgconfig)
00:02:20.198 Run-time dependency jansson found: NO (tried pkgconfig)
00:02:20.198 Run-time dependency openssl found: YES 3.0.9
00:02:20.198 Run-time dependency libpcap found: YES 1.10.4
00:02:20.198 Has header "pcap.h" with dependency libpcap: YES
00:02:20.198 Compiler for C supports arguments -Wcast-qual: YES
00:02:20.198 Compiler for C supports arguments -Wdeprecated: YES
00:02:20.198 Compiler for C supports arguments -Wformat: YES
00:02:20.198 Compiler for C supports arguments -Wformat-nonliteral: NO
00:02:20.198 Compiler for C supports arguments -Wformat-security: NO
00:02:20.198 Compiler for C supports arguments -Wmissing-declarations: YES
00:02:20.198 Compiler for C supports arguments -Wmissing-prototypes: YES
00:02:20.198 Compiler for C supports arguments -Wnested-externs: YES
00:02:20.198 Compiler for C supports arguments -Wold-style-definition: YES
00:02:20.198 Compiler for C supports arguments -Wpointer-arith: YES
00:02:20.198 Compiler for C supports arguments -Wsign-compare: YES
00:02:20.198 Compiler for C supports arguments -Wstrict-prototypes: YES
00:02:20.198 Compiler for C supports arguments -Wundef: YES
00:02:20.198 Compiler for C supports arguments -Wwrite-strings: YES
00:02:20.198 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:02:20.198 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:02:20.198 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:02:20.198 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:02:20.198 Program objdump found: YES (/usr/bin/objdump)
00:02:20.198 Compiler for C supports arguments -mavx512f: YES
00:02:20.198 Checking if "AVX512 checking" compiles: YES
00:02:20.198 Fetching value of define "__SSE4_2__" : 1
00:02:20.198 Fetching value of define "__AES__" : 1
00:02:20.198 Fetching value of define "__AVX__" : 1
00:02:20.198 Fetching value of define "__AVX2__" : 1
00:02:20.198 Fetching value of define "__AVX512BW__" : 1
00:02:20.198 Fetching value of define "__AVX512CD__" : 1
00:02:20.198 Fetching value of define "__AVX512DQ__" : 1
00:02:20.198 Fetching value of define "__AVX512F__" : 1
00:02:20.198 Fetching value of define "__AVX512VL__" : 1
00:02:20.198 Fetching value of define "__PCLMUL__" : 1
00:02:20.198 Fetching value of define "__RDRND__" : 1
00:02:20.198 Fetching value of define "__RDSEED__" : 1
00:02:20.198 Fetching value of define "__VPCLMULQDQ__" : 1
00:02:20.198 Fetching value of define "__znver1__" : (undefined)
00:02:20.198 Fetching value of define "__znver2__" : (undefined)
00:02:20.198 Fetching value of define "__znver3__" : (undefined)
00:02:20.198 Fetching value of define "__znver4__" : (undefined)
00:02:20.198 Compiler for C supports arguments -Wno-format-truncation: YES
00:02:20.198 Message: lib/log: Defining dependency "log"
00:02:20.198 Message: lib/kvargs: Defining dependency "kvargs"
00:02:20.198 Message: lib/telemetry: Defining dependency "telemetry"
00:02:20.198 Checking for function "getentropy" : NO
00:02:20.198 Message: lib/eal: Defining dependency "eal"
00:02:20.198 Message: lib/ring: Defining dependency "ring"
00:02:20.198 Message: lib/rcu: Defining dependency "rcu"
00:02:20.198 Message: lib/mempool: Defining dependency "mempool"
00:02:20.198 Message: lib/mbuf: Defining dependency "mbuf"
00:02:20.198 Fetching value of define "__PCLMUL__" : 1 (cached)
00:02:20.198 Fetching value of define "__AVX512F__" : 1 (cached)
00:02:20.198 Fetching value of define "__AVX512BW__" : 1 (cached)
00:02:20.198 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:02:20.198 Fetching value of define "__AVX512VL__" : 1 (cached)
00:02:20.198 Fetching value of define "__VPCLMULQDQ__" : 1 (cached)
00:02:20.198 Compiler for C supports arguments -mpclmul: YES
00:02:20.198 Compiler for C supports arguments -maes: YES
00:02:20.198 Compiler for C supports arguments -mavx512f: YES (cached)
00:02:20.198 Compiler for C supports arguments -mavx512bw: YES
00:02:20.198 Compiler for C supports arguments -mavx512dq: YES
00:02:20.198 Compiler for C supports arguments -mavx512vl: YES
00:02:20.198 Compiler for C supports arguments -mvpclmulqdq: YES
00:02:20.198 Compiler for C supports arguments -mavx2: YES
00:02:20.198 Compiler for C supports arguments -mavx: YES
00:02:20.198 Message: lib/net: Defining dependency "net"
00:02:20.198 Message: lib/meter: Defining dependency "meter"
00:02:20.198 Message: lib/ethdev: Defining dependency "ethdev"
00:02:20.198 Message: lib/pci: Defining dependency "pci"
00:02:20.198 Message: lib/cmdline: Defining dependency "cmdline"
00:02:20.198 Message: lib/hash: Defining dependency "hash"
00:02:20.198 Message: lib/timer: Defining dependency "timer"
00:02:20.198 Message: lib/compressdev: Defining dependency "compressdev"
00:02:20.198 Message: lib/cryptodev: Defining dependency "cryptodev"
00:02:20.198 Message: lib/dmadev: Defining dependency "dmadev"
00:02:20.198 Compiler for C supports arguments -Wno-cast-qual: YES
00:02:20.198 Message: lib/power: Defining dependency "power"
00:02:20.198 Message: lib/reorder: Defining dependency "reorder"
00:02:20.198 Message: lib/security: Defining dependency "security"
00:02:20.198 Has header "linux/userfaultfd.h" : YES
00:02:20.198 Has header "linux/vduse.h" : YES
00:02:20.198 Message: lib/vhost: Defining dependency "vhost"
00:02:20.198 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:02:20.198 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:02:20.198 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:02:20.198 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:02:20.198 Compiler for C supports arguments -std=c11: YES
00:02:20.198 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:02:20.198 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:02:20.198 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:02:20.198 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:02:20.198 Run-time dependency libmlx5 found: YES 1.24.44.0
00:02:20.198 Run-time dependency libibverbs found: YES 1.14.44.0
00:02:20.198 Library mtcr_ul found: NO
00:02:20.198 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:02:20.198 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:02:20.198 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:02:20.198 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO
00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO
00:02:21.141 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:02:21.141 Header "rdma/rdma_netlink.h" has symbol
"RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header 
"infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:02:21.141 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:02:21.141 Configuring mlx5_autoconf.h using configuration 00:02:21.141 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:02:21.141 Run-time dependency libcrypto found: YES 3.0.9 00:02:21.141 Library IPSec_MB found: YES 00:02:21.141 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:02:21.141 Message: drivers/common/qat: Defining dependency "common_qat" 00:02:21.141 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:02:21.141 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:02:21.141 Library IPSec_MB found: YES 00:02:21.141 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:02:21.141 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:02:21.141 Compiler for C supports arguments -std=c11: YES (cached) 00:02:21.141 Compiler for C 
supports arguments -Wno-strict-prototypes: YES (cached) 00:02:21.141 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:21.141 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:21.141 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:21.141 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:02:21.141 Run-time dependency libisal found: NO (tried pkgconfig) 00:02:21.141 Library libisal found: NO 00:02:21.141 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:02:21.141 Compiler for C supports arguments -std=c11: YES (cached) 00:02:21.141 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:02:21.141 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:02:21.141 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:02:21.141 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:02:21.141 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:02:21.141 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:02:21.141 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:02:21.141 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:02:21.141 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:02:21.141 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:02:21.141 Program doxygen found: YES (/usr/bin/doxygen) 00:02:21.141 Configuring doxy-api-html.conf using configuration 00:02:21.141 Configuring doxy-api-man.conf using configuration 00:02:21.141 Program mandb found: YES (/usr/bin/mandb) 00:02:21.141 Program sphinx-build found: NO 00:02:21.141 Configuring rte_build_config.h using configuration 00:02:21.141 Message: 00:02:21.141 ================= 00:02:21.141 Applications Enabled 00:02:21.141 ================= 00:02:21.141 00:02:21.141 apps: 00:02:21.141 00:02:21.141 00:02:21.141 
Message: 00:02:21.141 ================= 00:02:21.141 Libraries Enabled 00:02:21.141 ================= 00:02:21.141 00:02:21.141 libs: 00:02:21.141 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:02:21.141 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:02:21.141 cryptodev, dmadev, power, reorder, security, vhost, 00:02:21.141 00:02:21.141 Message: 00:02:21.141 =============== 00:02:21.141 Drivers Enabled 00:02:21.141 =============== 00:02:21.141 00:02:21.141 common: 00:02:21.141 mlx5, qat, 00:02:21.141 bus: 00:02:21.141 auxiliary, pci, vdev, 00:02:21.141 mempool: 00:02:21.141 ring, 00:02:21.141 dma: 00:02:21.141 00:02:21.141 net: 00:02:21.141 00:02:21.141 crypto: 00:02:21.141 ipsec_mb, mlx5, 00:02:21.141 compress: 00:02:21.141 isal, mlx5, 00:02:21.141 vdpa: 00:02:21.141 00:02:21.141 00:02:21.141 Message: 00:02:21.141 ================= 00:02:21.141 Content Skipped 00:02:21.141 ================= 00:02:21.141 00:02:21.141 apps: 00:02:21.141 dumpcap: explicitly disabled via build config 00:02:21.141 graph: explicitly disabled via build config 00:02:21.141 pdump: explicitly disabled via build config 00:02:21.141 proc-info: explicitly disabled via build config 00:02:21.141 test-acl: explicitly disabled via build config 00:02:21.141 test-bbdev: explicitly disabled via build config 00:02:21.141 test-cmdline: explicitly disabled via build config 00:02:21.141 test-compress-perf: explicitly disabled via build config 00:02:21.141 test-crypto-perf: explicitly disabled via build config 00:02:21.141 test-dma-perf: explicitly disabled via build config 00:02:21.141 test-eventdev: explicitly disabled via build config 00:02:21.141 test-fib: explicitly disabled via build config 00:02:21.141 test-flow-perf: explicitly disabled via build config 00:02:21.141 test-gpudev: explicitly disabled via build config 00:02:21.142 test-mldev: explicitly disabled via build config 00:02:21.142 test-pipeline: explicitly disabled via build config 00:02:21.142 test-pmd: 
explicitly disabled via build config 00:02:21.142 test-regex: explicitly disabled via build config 00:02:21.142 test-sad: explicitly disabled via build config 00:02:21.142 test-security-perf: explicitly disabled via build config 00:02:21.142 00:02:21.142 libs: 00:02:21.142 argparse: explicitly disabled via build config 00:02:21.142 metrics: explicitly disabled via build config 00:02:21.142 acl: explicitly disabled via build config 00:02:21.142 bbdev: explicitly disabled via build config 00:02:21.142 bitratestats: explicitly disabled via build config 00:02:21.142 bpf: explicitly disabled via build config 00:02:21.142 cfgfile: explicitly disabled via build config 00:02:21.142 distributor: explicitly disabled via build config 00:02:21.142 efd: explicitly disabled via build config 00:02:21.142 eventdev: explicitly disabled via build config 00:02:21.142 dispatcher: explicitly disabled via build config 00:02:21.142 gpudev: explicitly disabled via build config 00:02:21.142 gro: explicitly disabled via build config 00:02:21.142 gso: explicitly disabled via build config 00:02:21.142 ip_frag: explicitly disabled via build config 00:02:21.142 jobstats: explicitly disabled via build config 00:02:21.142 latencystats: explicitly disabled via build config 00:02:21.142 lpm: explicitly disabled via build config 00:02:21.142 member: explicitly disabled via build config 00:02:21.142 pcapng: explicitly disabled via build config 00:02:21.142 rawdev: explicitly disabled via build config 00:02:21.142 regexdev: explicitly disabled via build config 00:02:21.142 mldev: explicitly disabled via build config 00:02:21.142 rib: explicitly disabled via build config 00:02:21.142 sched: explicitly disabled via build config 00:02:21.142 stack: explicitly disabled via build config 00:02:21.142 ipsec: explicitly disabled via build config 00:02:21.142 pdcp: explicitly disabled via build config 00:02:21.142 fib: explicitly disabled via build config 00:02:21.142 port: explicitly disabled via build config 
00:02:21.142 pdump: explicitly disabled via build config 00:02:21.142 table: explicitly disabled via build config 00:02:21.142 pipeline: explicitly disabled via build config 00:02:21.142 graph: explicitly disabled via build config 00:02:21.142 node: explicitly disabled via build config 00:02:21.142 00:02:21.142 drivers: 00:02:21.142 common/cpt: not in enabled drivers build config 00:02:21.142 common/dpaax: not in enabled drivers build config 00:02:21.142 common/iavf: not in enabled drivers build config 00:02:21.142 common/idpf: not in enabled drivers build config 00:02:21.142 common/ionic: not in enabled drivers build config 00:02:21.142 common/mvep: not in enabled drivers build config 00:02:21.142 common/octeontx: not in enabled drivers build config 00:02:21.142 bus/cdx: not in enabled drivers build config 00:02:21.142 bus/dpaa: not in enabled drivers build config 00:02:21.142 bus/fslmc: not in enabled drivers build config 00:02:21.142 bus/ifpga: not in enabled drivers build config 00:02:21.142 bus/platform: not in enabled drivers build config 00:02:21.142 bus/uacce: not in enabled drivers build config 00:02:21.142 bus/vmbus: not in enabled drivers build config 00:02:21.142 common/cnxk: not in enabled drivers build config 00:02:21.142 common/nfp: not in enabled drivers build config 00:02:21.142 common/nitrox: not in enabled drivers build config 00:02:21.142 common/sfc_efx: not in enabled drivers build config 00:02:21.142 mempool/bucket: not in enabled drivers build config 00:02:21.142 mempool/cnxk: not in enabled drivers build config 00:02:21.142 mempool/dpaa: not in enabled drivers build config 00:02:21.142 mempool/dpaa2: not in enabled drivers build config 00:02:21.142 mempool/octeontx: not in enabled drivers build config 00:02:21.142 mempool/stack: not in enabled drivers build config 00:02:21.142 dma/cnxk: not in enabled drivers build config 00:02:21.142 dma/dpaa: not in enabled drivers build config 00:02:21.142 dma/dpaa2: not in enabled drivers build config 
00:02:21.142 dma/hisilicon: not in enabled drivers build config 00:02:21.142 dma/idxd: not in enabled drivers build config 00:02:21.142 dma/ioat: not in enabled drivers build config 00:02:21.142 dma/skeleton: not in enabled drivers build config 00:02:21.142 net/af_packet: not in enabled drivers build config 00:02:21.142 net/af_xdp: not in enabled drivers build config 00:02:21.142 net/ark: not in enabled drivers build config 00:02:21.142 net/atlantic: not in enabled drivers build config 00:02:21.142 net/avp: not in enabled drivers build config 00:02:21.142 net/axgbe: not in enabled drivers build config 00:02:21.142 net/bnx2x: not in enabled drivers build config 00:02:21.142 net/bnxt: not in enabled drivers build config 00:02:21.142 net/bonding: not in enabled drivers build config 00:02:21.142 net/cnxk: not in enabled drivers build config 00:02:21.142 net/cpfl: not in enabled drivers build config 00:02:21.142 net/cxgbe: not in enabled drivers build config 00:02:21.142 net/dpaa: not in enabled drivers build config 00:02:21.142 net/dpaa2: not in enabled drivers build config 00:02:21.142 net/e1000: not in enabled drivers build config 00:02:21.142 net/ena: not in enabled drivers build config 00:02:21.142 net/enetc: not in enabled drivers build config 00:02:21.142 net/enetfec: not in enabled drivers build config 00:02:21.142 net/enic: not in enabled drivers build config 00:02:21.142 net/failsafe: not in enabled drivers build config 00:02:21.142 net/fm10k: not in enabled drivers build config 00:02:21.142 net/gve: not in enabled drivers build config 00:02:21.142 net/hinic: not in enabled drivers build config 00:02:21.142 net/hns3: not in enabled drivers build config 00:02:21.142 net/i40e: not in enabled drivers build config 00:02:21.142 net/iavf: not in enabled drivers build config 00:02:21.142 net/ice: not in enabled drivers build config 00:02:21.142 net/idpf: not in enabled drivers build config 00:02:21.142 net/igc: not in enabled drivers build config 00:02:21.142 
net/ionic: not in enabled drivers build config 00:02:21.142 net/ipn3ke: not in enabled drivers build config 00:02:21.142 net/ixgbe: not in enabled drivers build config 00:02:21.142 net/mana: not in enabled drivers build config 00:02:21.142 net/memif: not in enabled drivers build config 00:02:21.142 net/mlx4: not in enabled drivers build config 00:02:21.142 net/mlx5: not in enabled drivers build config 00:02:21.142 net/mvneta: not in enabled drivers build config 00:02:21.142 net/mvpp2: not in enabled drivers build config 00:02:21.142 net/netvsc: not in enabled drivers build config 00:02:21.142 net/nfb: not in enabled drivers build config 00:02:21.142 net/nfp: not in enabled drivers build config 00:02:21.142 net/ngbe: not in enabled drivers build config 00:02:21.142 net/null: not in enabled drivers build config 00:02:21.142 net/octeontx: not in enabled drivers build config 00:02:21.142 net/octeon_ep: not in enabled drivers build config 00:02:21.142 net/pcap: not in enabled drivers build config 00:02:21.142 net/pfe: not in enabled drivers build config 00:02:21.142 net/qede: not in enabled drivers build config 00:02:21.142 net/ring: not in enabled drivers build config 00:02:21.142 net/sfc: not in enabled drivers build config 00:02:21.142 net/softnic: not in enabled drivers build config 00:02:21.142 net/tap: not in enabled drivers build config 00:02:21.142 net/thunderx: not in enabled drivers build config 00:02:21.142 net/txgbe: not in enabled drivers build config 00:02:21.142 net/vdev_netvsc: not in enabled drivers build config 00:02:21.142 net/vhost: not in enabled drivers build config 00:02:21.142 net/virtio: not in enabled drivers build config 00:02:21.142 net/vmxnet3: not in enabled drivers build config 00:02:21.142 raw/*: missing internal dependency, "rawdev" 00:02:21.142 crypto/armv8: not in enabled drivers build config 00:02:21.142 crypto/bcmfs: not in enabled drivers build config 00:02:21.142 crypto/caam_jr: not in enabled drivers build config 00:02:21.142 
crypto/ccp: not in enabled drivers build config 00:02:21.142 crypto/cnxk: not in enabled drivers build config 00:02:21.142 crypto/dpaa_sec: not in enabled drivers build config 00:02:21.142 crypto/dpaa2_sec: not in enabled drivers build config 00:02:21.142 crypto/mvsam: not in enabled drivers build config 00:02:21.142 crypto/nitrox: not in enabled drivers build config 00:02:21.142 crypto/null: not in enabled drivers build config 00:02:21.142 crypto/octeontx: not in enabled drivers build config 00:02:21.142 crypto/openssl: not in enabled drivers build config 00:02:21.142 crypto/scheduler: not in enabled drivers build config 00:02:21.142 crypto/uadk: not in enabled drivers build config 00:02:21.142 crypto/virtio: not in enabled drivers build config 00:02:21.142 compress/nitrox: not in enabled drivers build config 00:02:21.142 compress/octeontx: not in enabled drivers build config 00:02:21.142 compress/zlib: not in enabled drivers build config 00:02:21.142 regex/*: missing internal dependency, "regexdev" 00:02:21.142 ml/*: missing internal dependency, "mldev" 00:02:21.142 vdpa/ifc: not in enabled drivers build config 00:02:21.142 vdpa/mlx5: not in enabled drivers build config 00:02:21.142 vdpa/nfp: not in enabled drivers build config 00:02:21.142 vdpa/sfc: not in enabled drivers build config 00:02:21.142 event/*: missing internal dependency, "eventdev" 00:02:21.142 baseband/*: missing internal dependency, "bbdev" 00:02:21.142 gpu/*: missing internal dependency, "gpudev" 00:02:21.142 00:02:21.142 00:02:21.403 Build targets in project: 114 00:02:21.403 00:02:21.403 DPDK 24.03.0 00:02:21.403 00:02:21.403 User defined options 00:02:21.403 buildtype : debug 00:02:21.403 default_library : shared 00:02:21.403 libdir : lib 00:02:21.403 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:02:21.403 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds -I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 
-DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:02:21.403 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:02:21.403 cpu_instruction_set: native 00:02:21.403 disable_apps : dumpcap,graph,pdump,proc-info,test-acl,test-bbdev,test-cmdline,test-compress-perf,test-crypto-perf,test-dma-perf,test-eventdev,test-fib,test-flow-perf,test-gpudev,test-mldev,test-pipeline,test-pmd,test-regex,test-sad,test-security-perf,test 00:02:21.403 disable_libs : acl,argparse,bbdev,bitratestats,bpf,cfgfile,dispatcher,distributor,efd,eventdev,fib,gpudev,graph,gro,gso,ip_frag,ipsec,jobstats,latencystats,lpm,member,metrics,mldev,node,pcapng,pdcp,pdump,pipeline,port,rawdev,regexdev,rib,sched,stack,table 00:02:21.403 enable_docs : false 00:02:21.403 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:02:21.403 enable_kmods : false 00:02:21.403 max_lcores : 128 00:02:21.403 tests : false 00:02:21.403 00:02:21.403 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:02:21.990 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:02:21.990 [1/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:02:21.990 [2/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:02:21.990 [3/377] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:02:21.990 [4/377] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:02:21.990 [5/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:02:21.990 [6/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:02:21.990 [7/377] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:02:21.990 [8/377] Linking static target lib/librte_kvargs.a 00:02:21.990 [9/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:02:22.267 [10/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:02:22.267 [11/377] Compiling C object lib/librte_log.a.p/log_log.c.o 00:02:22.267 [12/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:02:22.267 [13/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:02:22.267 [14/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:02:22.267 [15/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:02:22.267 [16/377] Linking static target lib/librte_log.a 00:02:22.267 [17/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:02:22.267 [18/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:02:22.267 [19/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:02:22.267 [20/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:02:22.267 [21/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:02:22.530 [22/377] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:02:22.530 [23/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:02:22.530 [24/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:02:22.530 [25/377] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:02:22.530 [26/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:02:22.530 [27/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:02:22.530 [28/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:02:22.530 [29/377] Linking static target lib/librte_pci.a 00:02:22.530 [30/377] Compiling C object 
lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:02:22.530 [31/377] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:02:22.530 [32/377] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:02:22.530 [33/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:02:22.530 [34/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:02:22.530 [35/377] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:02:22.789 [36/377] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:02:22.789 [37/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:02:22.789 [38/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:02:22.789 [39/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:02:22.789 [40/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:02:23.054 [41/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:02:23.054 [42/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:02:23.054 [43/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:02:23.054 [44/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:02:23.054 [45/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:02:23.054 [46/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:02:23.054 [47/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:02:23.054 [48/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:02:23.054 [49/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:02:23.054 [50/377] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:02:23.054 [51/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:02:23.054 [52/377] Compiling C object 
lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:02:23.054 [53/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:02:23.054 [54/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:02:23.054 [55/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:02:23.054 [56/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:02:23.054 [57/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:02:23.054 [58/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:02:23.054 [59/377] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:02:23.054 [60/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:02:23.054 [61/377] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:02:23.054 [62/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:02:23.054 [63/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:02:23.054 [64/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:02:23.054 [65/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:02:23.054 [66/377] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:02:23.054 [67/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:02:23.054 [68/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:02:23.054 [69/377] Compiling C object lib/librte_net.a.p/net_net_crc_avx512.c.o 00:02:23.054 [70/377] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:02:23.054 [71/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:02:23.054 [72/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:02:23.054 [73/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:02:23.054 [74/377] Linking static target lib/librte_telemetry.a 
00:02:23.054 [75/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:02:23.054 [76/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:02:23.054 [77/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:02:23.054 [78/377] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:02:23.054 [79/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:02:23.054 [80/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:02:23.054 [81/377] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:02:23.054 [82/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:02:23.054 [83/377] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:02:23.054 [84/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:02:23.054 [85/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:02:23.054 [86/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:02:23.054 [87/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:02:23.054 [88/377] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:02:23.054 [89/377] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:02:23.054 [90/377] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:02:23.054 [91/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:02:23.054 [92/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:02:23.315 [93/377] Linking static target lib/librte_ring.a 00:02:23.315 [94/377] Linking static target lib/librte_meter.a 00:02:23.315 [95/377] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:02:23.315 [96/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:02:23.315 [97/377] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 
00:02:23.315 [98/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:02:23.315 [99/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:02:23.315 [100/377] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:02:23.315 [101/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:02:23.315 [102/377] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.315 [103/377] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:02:23.315 [104/377] Linking static target lib/librte_timer.a 00:02:23.315 [105/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:02:23.315 [106/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:02:23.315 [107/377] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.315 [108/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:02:23.316 [109/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:02:23.316 [110/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:02:23.316 [111/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:02:23.316 [112/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:02:23.316 [113/377] Linking static target lib/librte_cmdline.a 00:02:23.316 [114/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:02:23.316 [115/377] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:02:23.316 [116/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:02:23.316 [117/377] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:02:23.316 [118/377] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:02:23.316 [119/377] Linking static target lib/librte_mempool.a 00:02:23.316 [120/377] Compiling C 
object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:02:23.316 [121/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:02:23.316 [122/377] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:02:23.316 [123/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 00:02:23.316 [124/377] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:02:23.316 [125/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:02:23.316 [126/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:02:23.316 [127/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:02:23.316 [128/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:02:23.316 [129/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:02:23.316 [130/377] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:02:23.577 [131/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:02:23.577 [132/377] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:02:23.577 [133/377] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:02:23.577 [134/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:02:23.577 [135/377] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:02:23.577 [136/377] Linking static target lib/librte_compressdev.a 00:02:23.578 [137/377] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:02:23.578 [138/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:02:23.578 [139/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:02:23.578 [140/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:02:23.578 [141/377] Linking static target lib/librte_rcu.a 00:02:23.578 [142/377] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:02:23.578 
[143/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:02:23.578 [144/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:02:23.578 [145/377] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:02:23.578 [146/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:02:23.578 [147/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:02:23.578 [148/377] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:02:23.578 [149/377] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:02:23.578 [150/377] Linking static target lib/librte_mbuf.a 00:02:23.578 [151/377] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:02:23.578 [152/377] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:02:23.578 [153/377] Linking static target lib/librte_hash.a 00:02:23.578 [154/377] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:02:23.578 [155/377] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:02:23.578 [156/377] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:02:23.578 [157/377] Linking static target lib/librte_security.a 00:02:23.837 [158/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:02:23.837 [159/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:02:23.837 [160/377] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:02:23.837 [161/377] Linking static target lib/librte_power.a 00:02:23.837 [162/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:02:23.837 [163/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:02:23.837 [164/377] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:02:23.837 [165/377] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.837 
[166/377] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.837 [167/377] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:02:23.837 [168/377] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.837 [169/377] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:02:23.837 [170/377] Linking static target drivers/libtmp_rte_bus_vdev.a 00:02:23.837 [171/377] Linking static target lib/librte_reorder.a 00:02:23.837 [172/377] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:02:23.837 [173/377] Linking target lib/librte_log.so.24.1 00:02:23.837 [174/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:02:23.837 [175/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:02:23.837 [176/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:02:23.837 [177/377] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:02:23.837 [178/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:02:23.837 [179/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:02:23.837 [180/377] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:02:23.837 [181/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:02:23.837 [182/377] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:02:23.837 [183/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:02:23.837 [184/377] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.837 [185/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:02:23.837 [186/377] Linking static target lib/librte_eal.a 00:02:23.837 [187/377] Generating drivers/rte_bus_auxiliary.pmd.c 
with a custom command 00:02:23.837 [188/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:02:23.837 [189/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:02:23.837 [190/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:02:23.837 [191/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:02:23.837 [192/377] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.837 [193/377] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:23.837 [194/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:02:23.837 [195/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:02:23.837 [196/377] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:02:23.837 [197/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:02:23.837 [198/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:02:23.837 [199/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 00:02:23.837 [200/377] Linking static target drivers/librte_bus_auxiliary.a 00:02:23.837 [201/377] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:02:23.837 [202/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:02:23.837 [203/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:02:23.837 [204/377] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:02:23.837 [205/377] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:02:23.837 [206/377] Compiling C object 
drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:02:23.837 [207/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:02:23.837 [208/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:02:23.837 [209/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:02:23.837 [210/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:02:23.837 [211/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:02:23.837 [212/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:02:23.837 [213/377] Linking target lib/librte_kvargs.so.24.1 00:02:23.838 [214/377] Linking target lib/librte_telemetry.so.24.1 00:02:23.838 [215/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:02:23.838 [216/377] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:02:24.097 [217/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:02:24.097 [218/377] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:24.097 [219/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:02:24.097 [220/377] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:02:24.097 [221/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:02:24.097 [222/377] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:02:24.097 [223/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:02:24.097 [224/377] Linking static target drivers/librte_bus_vdev.a 00:02:24.097 [225/377] Linking static target lib/librte_net.a 00:02:24.097 [226/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 
00:02:24.097 [227/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:02:24.097 [228/377] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:02:24.097 [229/377] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:02:24.097 [230/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:02:24.097 [231/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:02:24.097 [232/377] Linking static target drivers/libtmp_rte_mempool_ring.a 00:02:24.097 [233/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:02:24.097 [234/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:02:24.097 [235/377] Linking static target drivers/libtmp_rte_bus_pci.a 00:02:24.097 [236/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:02:24.097 [237/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:02:24.097 [238/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:02:24.097 [239/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:02:24.097 [240/377] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:02:24.097 [241/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:02:24.097 [242/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:02:24.097 [243/377] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.097 [244/377] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:02:24.097 [245/377] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:02:24.097 [246/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:02:24.097 [247/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:02:24.097 [248/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:02:24.097 [249/377] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:02:24.097 [250/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:02:24.097 [251/377] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:02:24.097 [252/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:02:24.097 [253/377] Linking static target lib/librte_dmadev.a 00:02:24.097 [254/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:02:24.097 [255/377] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.097 [256/377] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.097 [257/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:02:24.097 [258/377] Linking static target lib/librte_cryptodev.a 00:02:24.097 [259/377] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:02:24.098 [260/377] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.098 [261/377] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:02:24.098 [262/377] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:02:24.098 [263/377] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:02:24.098 [264/377] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to 
capture output) 00:02:24.098 [265/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:02:24.358 [266/377] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:02:24.358 [267/377] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:02:24.358 [268/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:02:24.358 [269/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:02:24.358 [270/377] Linking static target drivers/libtmp_rte_compress_isal.a 00:02:24.358 [271/377] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:24.358 [272/377] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:02:24.358 [273/377] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:02:24.358 [274/377] Linking static target drivers/librte_mempool_ring.a 00:02:24.358 [275/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:02:24.358 [276/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:02:24.358 [277/377] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:24.358 [278/377] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:02:24.358 [279/377] Linking static target drivers/librte_bus_pci.a 00:02:24.358 [280/377] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:02:24.358 [281/377] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.358 [282/377] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:02:24.358 [283/377] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.358 [284/377] 
Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:02:24.358 [285/377] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:02:24.358 [286/377] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:02:24.358 [287/377] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:24.358 [288/377] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:02:24.358 [289/377] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:24.358 [290/377] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:02:24.358 [291/377] Linking static target drivers/librte_compress_mlx5.a 00:02:24.358 [292/377] Linking static target drivers/librte_crypto_mlx5.a 00:02:24.358 [293/377] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.358 [294/377] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.358 [295/377] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:02:24.619 [296/377] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:24.619 [297/377] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:02:24.619 [298/377] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.619 [299/377] Linking static target drivers/librte_compress_isal.a 00:02:24.619 [300/377] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.619 [301/377] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:02:24.619 [302/377] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:02:24.619 [303/377] Compiling C object 
lib/librte_vhost.a.p/vhost_vduse.c.o 00:02:24.619 [304/377] Linking static target drivers/libtmp_rte_common_mlx5.a 00:02:24.619 [305/377] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:24.619 [306/377] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:02:24.619 [307/377] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:02:24.619 [308/377] Linking static target drivers/librte_crypto_ipsec_mb.a 00:02:24.619 [309/377] Linking static target lib/librte_ethdev.a 00:02:24.619 [310/377] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:02:24.880 [311/377] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:02:24.880 [312/377] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:24.880 [313/377] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:24.880 [314/377] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:02:24.880 [315/377] Linking static target drivers/librte_common_mlx5.a 00:02:25.141 [316/377] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:02:25.141 [317/377] Linking static target drivers/libtmp_rte_common_qat.a 00:02:25.141 [318/377] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:02:25.417 [319/377] Generating drivers/rte_common_qat.pmd.c with a custom command 00:02:25.417 [320/377] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:25.417 [321/377] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:02:25.417 [322/377] Linking static target drivers/librte_common_qat.a 00:02:25.726 [323/377] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:02:25.726 [324/377] Linking 
static target lib/librte_vhost.a 00:02:26.320 [325/377] Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:27.708 [326/377] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:02:31.012 [327/377] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:02:34.317 [328/377] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:02:35.699 [329/377] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:02:35.699 [330/377] Linking target lib/librte_eal.so.24.1 00:02:35.699 [331/377] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:02:35.959 [332/377] Linking target lib/librte_meter.so.24.1 00:02:35.959 [333/377] Linking target lib/librte_ring.so.24.1 00:02:35.959 [334/377] Linking target lib/librte_timer.so.24.1 00:02:35.959 [335/377] Linking target lib/librte_pci.so.24.1 00:02:35.959 [336/377] Linking target lib/librte_dmadev.so.24.1 00:02:35.959 [337/377] Linking target drivers/librte_bus_auxiliary.so.24.1 00:02:35.959 [338/377] Linking target drivers/librte_bus_vdev.so.24.1 00:02:35.959 [339/377] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:02:35.959 [340/377] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:02:35.959 [341/377] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:02:35.959 [342/377] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:02:35.959 [343/377] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:02:35.959 [344/377] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:02:35.959 [345/377] Generating symbol file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:02:36.220 [346/377] Linking target 
lib/librte_rcu.so.24.1 00:02:36.220 [347/377] Linking target lib/librte_mempool.so.24.1 00:02:36.220 [348/377] Linking target drivers/librte_bus_pci.so.24.1 00:02:36.220 [349/377] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:02:36.220 [350/377] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:02:36.220 [351/377] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:02:36.220 [352/377] Linking target drivers/librte_mempool_ring.so.24.1 00:02:36.220 [353/377] Linking target lib/librte_mbuf.so.24.1 00:02:36.481 [354/377] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:02:36.481 [355/377] Linking target lib/librte_reorder.so.24.1 00:02:36.481 [356/377] Linking target lib/librte_compressdev.so.24.1 00:02:36.481 [357/377] Linking target lib/librte_net.so.24.1 00:02:36.481 [358/377] Linking target lib/librte_cryptodev.so.24.1 00:02:36.741 [359/377] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:02:36.741 [360/377] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:02:36.741 [361/377] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:02:36.741 [362/377] Linking target drivers/librte_compress_isal.so.24.1 00:02:36.741 [363/377] Linking target lib/librte_hash.so.24.1 00:02:36.741 [364/377] Linking target lib/librte_security.so.24.1 00:02:36.741 [365/377] Linking target lib/librte_cmdline.so.24.1 00:02:36.741 [366/377] Linking target lib/librte_ethdev.so.24.1 00:02:37.002 [367/377] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:02:37.002 [368/377] Generating symbol file lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:02:37.002 [369/377] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:02:37.002 [370/377] Linking target 
drivers/librte_common_mlx5.so.24.1 00:02:37.002 [371/377] Linking target lib/librte_power.so.24.1 00:02:37.002 [372/377] Linking target lib/librte_vhost.so.24.1 00:02:37.262 [373/377] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:02:37.262 [374/377] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:02:37.262 [375/377] Linking target drivers/librte_common_qat.so.24.1 00:02:37.262 [376/377] Linking target drivers/librte_compress_mlx5.so.24.1 00:02:37.262 [377/377] Linking target drivers/librte_crypto_mlx5.so.24.1 00:02:37.262 INFO: autodetecting backend as ninja 00:02:37.262 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 128 00:02:38.646 CC lib/log/log.o 00:02:38.646 CC lib/log/log_flags.o 00:02:38.646 CC lib/log/log_deprecated.o 00:02:38.646 CC lib/ut_mock/mock.o 00:02:38.646 CC lib/ut/ut.o 00:02:38.646 LIB libspdk_log.a 00:02:38.646 LIB libspdk_ut_mock.a 00:02:38.646 LIB libspdk_ut.a 00:02:38.646 SO libspdk_log.so.7.0 00:02:38.646 SO libspdk_ut.so.2.0 00:02:38.646 SO libspdk_ut_mock.so.6.0 00:02:38.646 SYMLINK libspdk_ut.so 00:02:38.646 SYMLINK libspdk_log.so 00:02:38.646 SYMLINK libspdk_ut_mock.so 00:02:39.217 CXX lib/trace_parser/trace.o 00:02:39.217 CC lib/util/base64.o 00:02:39.217 CC lib/util/bit_array.o 00:02:39.217 CC lib/util/cpuset.o 00:02:39.217 CC lib/util/crc16.o 00:02:39.217 CC lib/util/crc32.o 00:02:39.217 CC lib/dma/dma.o 00:02:39.217 CC lib/util/crc32c.o 00:02:39.217 CC lib/util/crc32_ieee.o 00:02:39.217 CC lib/util/crc64.o 00:02:39.217 CC lib/util/dif.o 00:02:39.217 CC lib/util/fd.o 00:02:39.217 CC lib/ioat/ioat.o 00:02:39.217 CC lib/util/fd_group.o 00:02:39.217 CC lib/util/file.o 00:02:39.217 CC lib/util/hexlify.o 00:02:39.217 CC lib/util/iov.o 00:02:39.217 CC lib/util/math.o 00:02:39.217 CC lib/util/net.o 00:02:39.217 CC lib/util/pipe.o 00:02:39.217 CC lib/util/strerror_tls.o 00:02:39.217 CC 
lib/util/string.o 00:02:39.217 CC lib/util/uuid.o 00:02:39.217 CC lib/util/xor.o 00:02:39.217 CC lib/util/zipf.o 00:02:39.217 CC lib/vfio_user/host/vfio_user_pci.o 00:02:39.217 CC lib/vfio_user/host/vfio_user.o 00:02:39.217 LIB libspdk_dma.a 00:02:39.477 SO libspdk_dma.so.4.0 00:02:39.478 LIB libspdk_ioat.a 00:02:39.478 SO libspdk_ioat.so.7.0 00:02:39.478 SYMLINK libspdk_dma.so 00:02:39.478 SYMLINK libspdk_ioat.so 00:02:39.478 LIB libspdk_vfio_user.a 00:02:39.478 LIB libspdk_util.a 00:02:39.478 SO libspdk_vfio_user.so.5.0 00:02:39.759 SO libspdk_util.so.10.0 00:02:39.759 SYMLINK libspdk_vfio_user.so 00:02:39.759 SYMLINK libspdk_util.so 00:02:40.020 LIB libspdk_trace_parser.a 00:02:40.020 SO libspdk_trace_parser.so.5.0 00:02:40.020 SYMLINK libspdk_trace_parser.so 00:02:40.279 CC lib/vmd/vmd.o 00:02:40.279 CC lib/env_dpdk/env.o 00:02:40.279 CC lib/vmd/led.o 00:02:40.280 CC lib/idxd/idxd.o 00:02:40.280 CC lib/env_dpdk/memory.o 00:02:40.280 CC lib/idxd/idxd_user.o 00:02:40.280 CC lib/json/json_parse.o 00:02:40.280 CC lib/env_dpdk/pci.o 00:02:40.280 CC lib/rdma_provider/common.o 00:02:40.280 CC lib/idxd/idxd_kernel.o 00:02:40.280 CC lib/json/json_util.o 00:02:40.280 CC lib/env_dpdk/init.o 00:02:40.280 CC lib/rdma_provider/rdma_provider_verbs.o 00:02:40.280 CC lib/json/json_write.o 00:02:40.280 CC lib/env_dpdk/threads.o 00:02:40.280 CC lib/reduce/reduce.o 00:02:40.280 CC lib/conf/conf.o 00:02:40.280 CC lib/env_dpdk/pci_ioat.o 00:02:40.280 CC lib/rdma_utils/rdma_utils.o 00:02:40.280 CC lib/env_dpdk/pci_virtio.o 00:02:40.280 CC lib/env_dpdk/pci_vmd.o 00:02:40.280 CC lib/env_dpdk/pci_idxd.o 00:02:40.280 CC lib/env_dpdk/pci_event.o 00:02:40.280 CC lib/env_dpdk/sigbus_handler.o 00:02:40.280 CC lib/env_dpdk/pci_dpdk.o 00:02:40.280 CC lib/env_dpdk/pci_dpdk_2207.o 00:02:40.280 CC lib/env_dpdk/pci_dpdk_2211.o 00:02:40.280 LIB libspdk_rdma_provider.a 00:02:40.539 SO libspdk_rdma_provider.so.6.0 00:02:40.539 LIB libspdk_rdma_utils.a 00:02:40.539 LIB libspdk_json.a 00:02:40.539 SO 
libspdk_rdma_utils.so.1.0 00:02:40.539 SYMLINK libspdk_rdma_provider.so 00:02:40.540 SO libspdk_json.so.6.0 00:02:40.540 SYMLINK libspdk_rdma_utils.so 00:02:40.540 SYMLINK libspdk_json.so 00:02:40.801 LIB libspdk_idxd.a 00:02:40.801 LIB libspdk_conf.a 00:02:40.801 SO libspdk_conf.so.6.0 00:02:40.801 SO libspdk_idxd.so.12.0 00:02:40.801 LIB libspdk_vmd.a 00:02:40.801 LIB libspdk_reduce.a 00:02:40.801 SO libspdk_vmd.so.6.0 00:02:40.801 SYMLINK libspdk_conf.so 00:02:40.801 SYMLINK libspdk_idxd.so 00:02:40.801 SO libspdk_reduce.so.6.1 00:02:40.801 SYMLINK libspdk_vmd.so 00:02:40.801 SYMLINK libspdk_reduce.so 00:02:41.061 CC lib/jsonrpc/jsonrpc_server.o 00:02:41.061 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:41.061 CC lib/jsonrpc/jsonrpc_client.o 00:02:41.061 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:41.321 LIB libspdk_jsonrpc.a 00:02:41.321 SO libspdk_jsonrpc.so.6.0 00:02:41.321 LIB libspdk_env_dpdk.a 00:02:41.321 SYMLINK libspdk_jsonrpc.so 00:02:41.321 SO libspdk_env_dpdk.so.15.0 00:02:41.581 SYMLINK libspdk_env_dpdk.so 00:02:41.581 CC lib/rpc/rpc.o 00:02:41.841 LIB libspdk_rpc.a 00:02:41.841 SO libspdk_rpc.so.6.0 00:02:42.101 SYMLINK libspdk_rpc.so 00:02:42.362 CC lib/notify/notify.o 00:02:42.362 CC lib/notify/notify_rpc.o 00:02:42.362 CC lib/keyring/keyring.o 00:02:42.362 CC lib/keyring/keyring_rpc.o 00:02:42.362 CC lib/trace/trace.o 00:02:42.362 CC lib/trace/trace_flags.o 00:02:42.362 CC lib/trace/trace_rpc.o 00:02:42.624 LIB libspdk_notify.a 00:02:42.624 SO libspdk_notify.so.6.0 00:02:42.624 LIB libspdk_keyring.a 00:02:42.624 LIB libspdk_trace.a 00:02:42.624 SO libspdk_keyring.so.1.0 00:02:42.624 SO libspdk_trace.so.10.0 00:02:42.624 SYMLINK libspdk_notify.so 00:02:42.624 SYMLINK libspdk_keyring.so 00:02:42.888 SYMLINK libspdk_trace.so 00:02:43.149 CC lib/thread/thread.o 00:02:43.149 CC lib/thread/iobuf.o 00:02:43.149 CC lib/sock/sock.o 00:02:43.149 CC lib/sock/sock_rpc.o 00:02:43.409 LIB libspdk_sock.a 00:02:43.409 SO libspdk_sock.so.10.0 00:02:43.670 SYMLINK 
libspdk_sock.so 00:02:43.930 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:43.930 CC lib/nvme/nvme_ctrlr.o 00:02:43.930 CC lib/nvme/nvme_fabric.o 00:02:43.930 CC lib/nvme/nvme_ns_cmd.o 00:02:43.930 CC lib/nvme/nvme_ns.o 00:02:43.930 CC lib/nvme/nvme_pcie_common.o 00:02:43.930 CC lib/nvme/nvme_pcie.o 00:02:43.930 CC lib/nvme/nvme_qpair.o 00:02:43.930 CC lib/nvme/nvme.o 00:02:43.930 CC lib/nvme/nvme_quirks.o 00:02:43.930 CC lib/nvme/nvme_transport.o 00:02:43.930 CC lib/nvme/nvme_discovery.o 00:02:43.930 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:43.930 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:43.930 CC lib/nvme/nvme_tcp.o 00:02:43.930 CC lib/nvme/nvme_opal.o 00:02:43.930 CC lib/nvme/nvme_io_msg.o 00:02:43.930 CC lib/nvme/nvme_poll_group.o 00:02:43.930 CC lib/nvme/nvme_zns.o 00:02:43.930 CC lib/nvme/nvme_stubs.o 00:02:43.930 CC lib/nvme/nvme_auth.o 00:02:43.930 CC lib/nvme/nvme_cuse.o 00:02:43.930 CC lib/nvme/nvme_rdma.o 00:02:44.500 LIB libspdk_thread.a 00:02:44.500 SO libspdk_thread.so.10.1 00:02:44.500 SYMLINK libspdk_thread.so 00:02:44.761 CC lib/virtio/virtio.o 00:02:44.761 CC lib/accel/accel.o 00:02:44.761 CC lib/virtio/virtio_vhost_user.o 00:02:44.761 CC lib/accel/accel_rpc.o 00:02:44.761 CC lib/virtio/virtio_vfio_user.o 00:02:44.761 CC lib/accel/accel_sw.o 00:02:44.761 CC lib/virtio/virtio_pci.o 00:02:44.761 CC lib/init/json_config.o 00:02:44.761 CC lib/init/subsystem.o 00:02:44.761 CC lib/init/subsystem_rpc.o 00:02:44.761 CC lib/init/rpc.o 00:02:44.761 CC lib/blob/blobstore.o 00:02:44.761 CC lib/blob/request.o 00:02:44.761 CC lib/blob/zeroes.o 00:02:44.761 CC lib/blob/blob_bs_dev.o 00:02:45.022 LIB libspdk_init.a 00:02:45.284 LIB libspdk_virtio.a 00:02:45.284 SO libspdk_init.so.5.0 00:02:45.284 SO libspdk_virtio.so.7.0 00:02:45.284 SYMLINK libspdk_init.so 00:02:45.284 SYMLINK libspdk_virtio.so 00:02:45.544 CC lib/event/app.o 00:02:45.544 CC lib/event/reactor.o 00:02:45.544 CC lib/event/log_rpc.o 00:02:45.544 CC lib/event/app_rpc.o 00:02:45.544 CC lib/event/scheduler_static.o 
00:02:45.544 LIB libspdk_accel.a 00:02:45.804 SO libspdk_accel.so.16.0 00:02:45.804 LIB libspdk_nvme.a 00:02:45.804 SYMLINK libspdk_accel.so 00:02:45.804 SO libspdk_nvme.so.13.1 00:02:46.065 LIB libspdk_event.a 00:02:46.065 SO libspdk_event.so.14.0 00:02:46.065 SYMLINK libspdk_event.so 00:02:46.065 SYMLINK libspdk_nvme.so 00:02:46.065 CC lib/bdev/bdev.o 00:02:46.065 CC lib/bdev/bdev_rpc.o 00:02:46.065 CC lib/bdev/bdev_zone.o 00:02:46.065 CC lib/bdev/part.o 00:02:46.065 CC lib/bdev/scsi_nvme.o 00:02:47.446 LIB libspdk_blob.a 00:02:47.446 SO libspdk_blob.so.11.0 00:02:47.446 SYMLINK libspdk_blob.so 00:02:47.708 CC lib/lvol/lvol.o 00:02:47.708 CC lib/blobfs/blobfs.o 00:02:47.708 CC lib/blobfs/tree.o 00:02:48.280 LIB libspdk_bdev.a 00:02:48.280 SO libspdk_bdev.so.16.0 00:02:48.541 SYMLINK libspdk_bdev.so 00:02:48.541 LIB libspdk_blobfs.a 00:02:48.541 SO libspdk_blobfs.so.10.0 00:02:48.541 LIB libspdk_lvol.a 00:02:48.541 SO libspdk_lvol.so.10.0 00:02:48.541 SYMLINK libspdk_blobfs.so 00:02:48.802 SYMLINK libspdk_lvol.so 00:02:48.802 CC lib/ublk/ublk.o 00:02:48.802 CC lib/ublk/ublk_rpc.o 00:02:48.802 CC lib/nbd/nbd.o 00:02:48.802 CC lib/nbd/nbd_rpc.o 00:02:48.802 CC lib/scsi/dev.o 00:02:48.802 CC lib/ftl/ftl_core.o 00:02:48.802 CC lib/nvmf/ctrlr.o 00:02:48.802 CC lib/scsi/lun.o 00:02:48.802 CC lib/ftl/ftl_init.o 00:02:48.802 CC lib/nvmf/ctrlr_discovery.o 00:02:48.802 CC lib/scsi/port.o 00:02:48.802 CC lib/nvmf/ctrlr_bdev.o 00:02:48.802 CC lib/ftl/ftl_layout.o 00:02:48.802 CC lib/scsi/scsi.o 00:02:48.802 CC lib/nvmf/subsystem.o 00:02:48.802 CC lib/scsi/scsi_bdev.o 00:02:48.802 CC lib/ftl/ftl_debug.o 00:02:48.802 CC lib/nvmf/nvmf.o 00:02:48.802 CC lib/scsi/scsi_pr.o 00:02:48.802 CC lib/ftl/ftl_io.o 00:02:48.802 CC lib/nvmf/nvmf_rpc.o 00:02:48.802 CC lib/ftl/ftl_sb.o 00:02:48.802 CC lib/scsi/scsi_rpc.o 00:02:48.802 CC lib/nvmf/transport.o 00:02:48.802 CC lib/ftl/ftl_l2p.o 00:02:48.802 CC lib/scsi/task.o 00:02:48.802 CC lib/nvmf/tcp.o 00:02:48.802 CC lib/ftl/ftl_l2p_flat.o 
00:02:48.802 CC lib/nvmf/stubs.o 00:02:48.802 CC lib/ftl/ftl_nv_cache.o 00:02:48.802 CC lib/nvmf/mdns_server.o 00:02:48.802 CC lib/ftl/ftl_band.o 00:02:48.802 CC lib/nvmf/rdma.o 00:02:48.802 CC lib/nvmf/auth.o 00:02:48.802 CC lib/ftl/ftl_band_ops.o 00:02:48.802 CC lib/ftl/ftl_writer.o 00:02:48.802 CC lib/ftl/ftl_rq.o 00:02:48.802 CC lib/ftl/ftl_reloc.o 00:02:48.802 CC lib/ftl/ftl_l2p_cache.o 00:02:48.802 CC lib/ftl/ftl_p2l.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:48.802 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:48.802 CC lib/ftl/utils/ftl_conf.o 00:02:48.802 CC lib/ftl/utils/ftl_md.o 00:02:48.802 CC lib/ftl/utils/ftl_mempool.o 00:02:48.802 CC lib/ftl/utils/ftl_bitmap.o 00:02:48.802 CC lib/ftl/utils/ftl_property.o 00:02:48.802 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:48.802 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:48.802 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:48.802 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:48.802 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:48.802 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:48.802 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:48.802 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:48.802 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:48.802 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:48.802 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:48.802 CC lib/ftl/base/ftl_base_dev.o 00:02:48.802 CC lib/ftl/ftl_trace.o 00:02:48.802 CC lib/ftl/base/ftl_base_bdev.o 00:02:49.372 LIB libspdk_scsi.a 00:02:49.695 LIB libspdk_nbd.a 00:02:49.695 SO libspdk_scsi.so.9.0 
00:02:49.695 SO libspdk_nbd.so.7.0 00:02:49.695 LIB libspdk_ublk.a 00:02:49.695 SYMLINK libspdk_scsi.so 00:02:49.695 SYMLINK libspdk_nbd.so 00:02:49.695 SO libspdk_ublk.so.3.0 00:02:49.695 SYMLINK libspdk_ublk.so 00:02:49.956 CC lib/iscsi/conn.o 00:02:49.956 CC lib/vhost/vhost.o 00:02:49.956 CC lib/iscsi/iscsi.o 00:02:49.956 CC lib/iscsi/init_grp.o 00:02:49.956 CC lib/vhost/vhost_rpc.o 00:02:49.956 CC lib/vhost/vhost_scsi.o 00:02:49.956 CC lib/iscsi/md5.o 00:02:49.956 CC lib/iscsi/param.o 00:02:49.956 CC lib/vhost/vhost_blk.o 00:02:49.956 CC lib/iscsi/portal_grp.o 00:02:49.956 CC lib/vhost/rte_vhost_user.o 00:02:49.956 CC lib/iscsi/tgt_node.o 00:02:49.956 CC lib/iscsi/iscsi_subsystem.o 00:02:49.956 CC lib/iscsi/iscsi_rpc.o 00:02:49.956 CC lib/iscsi/task.o 00:02:49.956 LIB libspdk_ftl.a 00:02:50.217 SO libspdk_ftl.so.9.0 00:02:50.478 SYMLINK libspdk_ftl.so 00:02:51.050 LIB libspdk_vhost.a 00:02:51.050 SO libspdk_vhost.so.8.0 00:02:51.050 SYMLINK libspdk_vhost.so 00:02:51.050 LIB libspdk_iscsi.a 00:02:51.311 SO libspdk_iscsi.so.8.0 00:02:51.311 SYMLINK libspdk_iscsi.so 00:02:52.697 LIB libspdk_nvmf.a 00:02:52.697 SO libspdk_nvmf.so.19.0 00:02:52.697 SYMLINK libspdk_nvmf.so 00:02:53.268 CC module/env_dpdk/env_dpdk_rpc.o 00:02:53.528 LIB libspdk_env_dpdk_rpc.a 00:02:53.528 CC module/keyring/file/keyring_rpc.o 00:02:53.528 CC module/keyring/file/keyring.o 00:02:53.528 CC module/accel/error/accel_error.o 00:02:53.528 CC module/accel/error/accel_error_rpc.o 00:02:53.528 CC module/keyring/linux/keyring.o 00:02:53.528 CC module/accel/ioat/accel_ioat.o 00:02:53.528 CC module/keyring/linux/keyring_rpc.o 00:02:53.528 CC module/accel/ioat/accel_ioat_rpc.o 00:02:53.528 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:53.528 CC module/accel/dsa/accel_dsa.o 00:02:53.528 CC module/sock/posix/posix.o 00:02:53.528 CC module/accel/dsa/accel_dsa_rpc.o 00:02:53.528 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:53.528 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 
00:02:53.528 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:53.528 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:53.528 CC module/accel/iaa/accel_iaa.o 00:02:53.528 SO libspdk_env_dpdk_rpc.so.6.0 00:02:53.528 CC module/blob/bdev/blob_bdev.o 00:02:53.528 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:53.528 CC module/accel/iaa/accel_iaa_rpc.o 00:02:53.528 CC module/scheduler/gscheduler/gscheduler.o 00:02:53.528 SYMLINK libspdk_env_dpdk_rpc.so 00:02:53.788 LIB libspdk_keyring_file.a 00:02:53.788 LIB libspdk_scheduler_gscheduler.a 00:02:53.788 LIB libspdk_accel_error.a 00:02:53.788 LIB libspdk_accel_ioat.a 00:02:53.788 LIB libspdk_scheduler_dpdk_governor.a 00:02:53.788 SO libspdk_keyring_file.so.1.0 00:02:53.788 SO libspdk_scheduler_gscheduler.so.4.0 00:02:53.788 LIB libspdk_accel_iaa.a 00:02:53.788 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:53.788 LIB libspdk_scheduler_dynamic.a 00:02:53.788 SO libspdk_accel_ioat.so.6.0 00:02:53.788 SO libspdk_accel_error.so.2.0 00:02:53.788 SO libspdk_accel_iaa.so.3.0 00:02:53.788 SYMLINK libspdk_scheduler_gscheduler.so 00:02:53.788 SYMLINK libspdk_keyring_file.so 00:02:53.788 SO libspdk_scheduler_dynamic.so.4.0 00:02:53.788 LIB libspdk_blob_bdev.a 00:02:53.788 LIB libspdk_keyring_linux.a 00:02:53.788 SYMLINK libspdk_accel_error.so 00:02:53.788 SYMLINK libspdk_accel_ioat.so 00:02:53.788 SO libspdk_blob_bdev.so.11.0 00:02:53.788 LIB libspdk_accel_dsa.a 00:02:53.788 SYMLINK libspdk_accel_iaa.so 00:02:53.788 SO libspdk_keyring_linux.so.1.0 00:02:53.788 SYMLINK libspdk_scheduler_dynamic.so 00:02:53.788 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:54.047 SYMLINK libspdk_blob_bdev.so 00:02:54.047 SO libspdk_accel_dsa.so.5.0 00:02:54.047 SYMLINK libspdk_keyring_linux.so 00:02:54.047 SYMLINK libspdk_accel_dsa.so 00:02:54.308 LIB libspdk_sock_posix.a 00:02:54.308 SO libspdk_sock_posix.so.6.0 00:02:54.308 SYMLINK libspdk_sock_posix.so 00:02:54.568 LIB 
libspdk_accel_dpdk_compressdev.a 00:02:54.568 CC module/blobfs/bdev/blobfs_bdev.o 00:02:54.568 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:54.568 CC module/bdev/malloc/bdev_malloc.o 00:02:54.568 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:54.568 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:54.568 CC module/bdev/delay/vbdev_delay.o 00:02:54.568 CC module/bdev/error/vbdev_error.o 00:02:54.568 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:54.568 CC module/bdev/aio/bdev_aio.o 00:02:54.568 CC module/bdev/null/bdev_null.o 00:02:54.568 CC module/bdev/gpt/gpt.o 00:02:54.568 CC module/bdev/null/bdev_null_rpc.o 00:02:54.568 CC module/bdev/aio/bdev_aio_rpc.o 00:02:54.568 CC module/bdev/error/vbdev_error_rpc.o 00:02:54.568 CC module/bdev/gpt/vbdev_gpt.o 00:02:54.568 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:54.568 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:54.568 CC module/bdev/crypto/vbdev_crypto.o 00:02:54.568 CC module/bdev/raid/bdev_raid.o 00:02:54.568 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:54.568 CC module/bdev/split/vbdev_split.o 00:02:54.568 CC module/bdev/raid/bdev_raid_rpc.o 00:02:54.568 CC module/bdev/raid/bdev_raid_sb.o 00:02:54.568 CC module/bdev/passthru/vbdev_passthru.o 00:02:54.568 CC module/bdev/raid/raid1.o 00:02:54.568 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:54.568 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:54.568 CC module/bdev/raid/raid0.o 00:02:54.568 CC module/bdev/lvol/vbdev_lvol.o 00:02:54.568 CC module/bdev/ftl/bdev_ftl.o 00:02:54.568 CC module/bdev/raid/concat.o 00:02:54.568 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:54.568 CC module/bdev/split/vbdev_split_rpc.o 00:02:54.568 CC module/bdev/nvme/nvme_rpc.o 00:02:54.568 CC module/bdev/nvme/bdev_nvme.o 00:02:54.568 CC module/bdev/compress/vbdev_compress.o 00:02:54.568 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:54.568 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:54.568 CC module/bdev/nvme/bdev_mdns_client.o 00:02:54.568 CC module/bdev/nvme/vbdev_opal.o 
00:02:54.568 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:54.568 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:54.568 CC module/bdev/iscsi/bdev_iscsi.o 00:02:54.568 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:54.568 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:54.568 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:54.568 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:54.568 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:54.829 LIB libspdk_accel_dpdk_cryptodev.a 00:02:54.829 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:54.829 LIB libspdk_blobfs_bdev.a 00:02:54.829 LIB libspdk_bdev_null.a 00:02:54.829 SO libspdk_blobfs_bdev.so.6.0 00:02:54.829 LIB libspdk_bdev_split.a 00:02:54.829 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:54.829 SO libspdk_bdev_split.so.6.0 00:02:54.829 LIB libspdk_bdev_error.a 00:02:54.829 SO libspdk_bdev_null.so.6.0 00:02:54.829 LIB libspdk_bdev_passthru.a 00:02:54.829 SO libspdk_bdev_error.so.6.0 00:02:54.829 SYMLINK libspdk_blobfs_bdev.so 00:02:54.829 LIB libspdk_bdev_aio.a 00:02:54.829 LIB libspdk_bdev_ftl.a 00:02:54.829 LIB libspdk_bdev_malloc.a 00:02:54.829 LIB libspdk_bdev_delay.a 00:02:54.829 SO libspdk_bdev_passthru.so.6.0 00:02:54.829 SO libspdk_bdev_ftl.so.6.0 00:02:54.829 SYMLINK libspdk_bdev_split.so 00:02:54.829 SYMLINK libspdk_bdev_null.so 00:02:54.829 LIB libspdk_bdev_crypto.a 00:02:54.829 SO libspdk_bdev_aio.so.6.0 00:02:54.829 SO libspdk_bdev_malloc.so.6.0 00:02:55.090 SO libspdk_bdev_delay.so.6.0 00:02:55.090 SYMLINK libspdk_bdev_error.so 00:02:55.090 SO libspdk_bdev_crypto.so.6.0 00:02:55.090 LIB libspdk_bdev_iscsi.a 00:02:55.090 LIB libspdk_bdev_compress.a 00:02:55.090 SYMLINK libspdk_bdev_ftl.so 00:02:55.090 SYMLINK libspdk_bdev_aio.so 00:02:55.090 SYMLINK libspdk_bdev_passthru.so 00:02:55.090 LIB libspdk_bdev_zone_block.a 00:02:55.090 SO libspdk_bdev_compress.so.6.0 00:02:55.090 SYMLINK libspdk_bdev_malloc.so 00:02:55.090 SO libspdk_bdev_iscsi.so.6.0 00:02:55.090 SYMLINK libspdk_bdev_delay.so 00:02:55.090 LIB 
libspdk_bdev_gpt.a 00:02:55.090 SYMLINK libspdk_bdev_crypto.so 00:02:55.090 SO libspdk_bdev_zone_block.so.6.0 00:02:55.090 LIB libspdk_bdev_lvol.a 00:02:55.090 SO libspdk_bdev_gpt.so.6.0 00:02:55.090 SYMLINK libspdk_bdev_compress.so 00:02:55.090 SYMLINK libspdk_bdev_iscsi.so 00:02:55.090 LIB libspdk_bdev_virtio.a 00:02:55.090 SO libspdk_bdev_lvol.so.6.0 00:02:55.090 SYMLINK libspdk_bdev_zone_block.so 00:02:55.090 SO libspdk_bdev_virtio.so.6.0 00:02:55.090 SYMLINK libspdk_bdev_gpt.so 00:02:55.090 SYMLINK libspdk_bdev_lvol.so 00:02:55.351 SYMLINK libspdk_bdev_virtio.so 00:02:55.351 LIB libspdk_bdev_raid.a 00:02:55.612 SO libspdk_bdev_raid.so.6.0 00:02:55.612 SYMLINK libspdk_bdev_raid.so 00:02:56.556 LIB libspdk_bdev_nvme.a 00:02:56.556 SO libspdk_bdev_nvme.so.7.0 00:02:56.817 SYMLINK libspdk_bdev_nvme.so 00:02:57.389 CC module/event/subsystems/iobuf/iobuf.o 00:02:57.389 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:57.389 CC module/event/subsystems/scheduler/scheduler.o 00:02:57.389 CC module/event/subsystems/vmd/vmd.o 00:02:57.389 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:57.389 CC module/event/subsystems/keyring/keyring.o 00:02:57.389 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:57.389 CC module/event/subsystems/sock/sock.o 00:02:57.650 LIB libspdk_event_scheduler.a 00:02:57.650 LIB libspdk_event_keyring.a 00:02:57.650 LIB libspdk_event_iobuf.a 00:02:57.650 LIB libspdk_event_vmd.a 00:02:57.650 LIB libspdk_event_sock.a 00:02:57.650 LIB libspdk_event_vhost_blk.a 00:02:57.650 SO libspdk_event_scheduler.so.4.0 00:02:57.650 SO libspdk_event_keyring.so.1.0 00:02:57.650 SO libspdk_event_vmd.so.6.0 00:02:57.650 SO libspdk_event_sock.so.5.0 00:02:57.650 SO libspdk_event_iobuf.so.3.0 00:02:57.650 SO libspdk_event_vhost_blk.so.3.0 00:02:57.650 SYMLINK libspdk_event_keyring.so 00:02:57.650 SYMLINK libspdk_event_scheduler.so 00:02:57.650 SYMLINK libspdk_event_vmd.so 00:02:57.650 SYMLINK libspdk_event_sock.so 00:02:57.650 SYMLINK libspdk_event_vhost_blk.so 
00:02:57.650 SYMLINK libspdk_event_iobuf.so 00:02:58.222 CC module/event/subsystems/accel/accel.o 00:02:58.222 LIB libspdk_event_accel.a 00:02:58.222 SO libspdk_event_accel.so.6.0 00:02:58.483 SYMLINK libspdk_event_accel.so 00:02:58.744 CC module/event/subsystems/bdev/bdev.o 00:02:59.006 LIB libspdk_event_bdev.a 00:02:59.006 SO libspdk_event_bdev.so.6.0 00:02:59.006 SYMLINK libspdk_event_bdev.so 00:02:59.266 CC module/event/subsystems/nbd/nbd.o 00:02:59.266 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:59.266 CC module/event/subsystems/ublk/ublk.o 00:02:59.266 CC module/event/subsystems/scsi/scsi.o 00:02:59.266 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:59.526 LIB libspdk_event_nbd.a 00:02:59.526 LIB libspdk_event_ublk.a 00:02:59.526 LIB libspdk_event_scsi.a 00:02:59.526 SO libspdk_event_nbd.so.6.0 00:02:59.526 SO libspdk_event_ublk.so.3.0 00:02:59.526 SO libspdk_event_scsi.so.6.0 00:02:59.526 LIB libspdk_event_nvmf.a 00:02:59.526 SYMLINK libspdk_event_nbd.so 00:02:59.526 SYMLINK libspdk_event_ublk.so 00:02:59.526 SO libspdk_event_nvmf.so.6.0 00:02:59.787 SYMLINK libspdk_event_scsi.so 00:02:59.787 SYMLINK libspdk_event_nvmf.so 00:03:00.047 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:03:00.047 CC module/event/subsystems/iscsi/iscsi.o 00:03:00.307 LIB libspdk_event_vhost_scsi.a 00:03:00.307 LIB libspdk_event_iscsi.a 00:03:00.307 SO libspdk_event_vhost_scsi.so.3.0 00:03:00.307 SO libspdk_event_iscsi.so.6.0 00:03:00.307 SYMLINK libspdk_event_vhost_scsi.so 00:03:00.307 SYMLINK libspdk_event_iscsi.so 00:03:00.568 SO libspdk.so.6.0 00:03:00.568 SYMLINK libspdk.so 00:03:00.829 CC app/trace_record/trace_record.o 00:03:00.829 CXX app/trace/trace.o 00:03:00.829 TEST_HEADER include/spdk/accel.h 00:03:00.829 CC app/spdk_nvme_identify/identify.o 00:03:00.829 TEST_HEADER include/spdk/accel_module.h 00:03:00.829 TEST_HEADER include/spdk/assert.h 00:03:00.829 CC app/spdk_lspci/spdk_lspci.o 00:03:00.829 TEST_HEADER include/spdk/barrier.h 00:03:00.829 TEST_HEADER 
include/spdk/base64.h 00:03:00.829 TEST_HEADER include/spdk/bdev.h 00:03:00.829 TEST_HEADER include/spdk/bdev_module.h 00:03:00.829 CC app/spdk_nvme_discover/discovery_aer.o 00:03:00.829 TEST_HEADER include/spdk/bdev_zone.h 00:03:00.829 TEST_HEADER include/spdk/bit_array.h 00:03:00.829 TEST_HEADER include/spdk/bit_pool.h 00:03:00.829 TEST_HEADER include/spdk/blob_bdev.h 00:03:00.829 CC app/spdk_top/spdk_top.o 00:03:00.829 CC test/rpc_client/rpc_client_test.o 00:03:00.829 CC app/spdk_nvme_perf/perf.o 00:03:00.829 TEST_HEADER include/spdk/blobfs_bdev.h 00:03:00.829 TEST_HEADER include/spdk/blobfs.h 00:03:00.829 TEST_HEADER include/spdk/blob.h 00:03:00.829 TEST_HEADER include/spdk/config.h 00:03:00.829 TEST_HEADER include/spdk/conf.h 00:03:00.829 TEST_HEADER include/spdk/cpuset.h 00:03:00.829 TEST_HEADER include/spdk/crc16.h 00:03:00.829 TEST_HEADER include/spdk/crc32.h 00:03:00.829 TEST_HEADER include/spdk/crc64.h 00:03:00.829 TEST_HEADER include/spdk/dma.h 00:03:00.829 TEST_HEADER include/spdk/dif.h 00:03:00.829 TEST_HEADER include/spdk/endian.h 00:03:01.096 TEST_HEADER include/spdk/env_dpdk.h 00:03:01.096 TEST_HEADER include/spdk/env.h 00:03:01.096 TEST_HEADER include/spdk/event.h 00:03:01.096 TEST_HEADER include/spdk/fd_group.h 00:03:01.096 TEST_HEADER include/spdk/fd.h 00:03:01.096 TEST_HEADER include/spdk/file.h 00:03:01.096 TEST_HEADER include/spdk/ftl.h 00:03:01.096 TEST_HEADER include/spdk/gpt_spec.h 00:03:01.096 TEST_HEADER include/spdk/hexlify.h 00:03:01.096 TEST_HEADER include/spdk/idxd.h 00:03:01.096 TEST_HEADER include/spdk/histogram_data.h 00:03:01.096 CC app/iscsi_tgt/iscsi_tgt.o 00:03:01.096 TEST_HEADER include/spdk/init.h 00:03:01.096 TEST_HEADER include/spdk/idxd_spec.h 00:03:01.096 TEST_HEADER include/spdk/ioat.h 00:03:01.096 CC examples/interrupt_tgt/interrupt_tgt.o 00:03:01.096 TEST_HEADER include/spdk/iscsi_spec.h 00:03:01.096 TEST_HEADER include/spdk/ioat_spec.h 00:03:01.096 TEST_HEADER include/spdk/json.h 00:03:01.096 CC app/spdk_dd/spdk_dd.o 
00:03:01.096 CC app/nvmf_tgt/nvmf_main.o 00:03:01.096 TEST_HEADER include/spdk/keyring.h 00:03:01.096 TEST_HEADER include/spdk/jsonrpc.h 00:03:01.096 TEST_HEADER include/spdk/keyring_module.h 00:03:01.096 TEST_HEADER include/spdk/likely.h 00:03:01.096 TEST_HEADER include/spdk/log.h 00:03:01.096 TEST_HEADER include/spdk/lvol.h 00:03:01.096 TEST_HEADER include/spdk/memory.h 00:03:01.096 TEST_HEADER include/spdk/nbd.h 00:03:01.096 TEST_HEADER include/spdk/mmio.h 00:03:01.096 TEST_HEADER include/spdk/net.h 00:03:01.096 TEST_HEADER include/spdk/notify.h 00:03:01.096 TEST_HEADER include/spdk/nvme.h 00:03:01.096 TEST_HEADER include/spdk/nvme_ocssd.h 00:03:01.096 TEST_HEADER include/spdk/nvme_intel.h 00:03:01.096 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:03:01.096 TEST_HEADER include/spdk/nvme_spec.h 00:03:01.096 TEST_HEADER include/spdk/nvme_zns.h 00:03:01.096 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:03:01.096 TEST_HEADER include/spdk/nvmf_cmd.h 00:03:01.096 CC app/spdk_tgt/spdk_tgt.o 00:03:01.096 TEST_HEADER include/spdk/nvmf.h 00:03:01.096 TEST_HEADER include/spdk/nvmf_spec.h 00:03:01.096 TEST_HEADER include/spdk/nvmf_transport.h 00:03:01.096 TEST_HEADER include/spdk/opal.h 00:03:01.096 TEST_HEADER include/spdk/opal_spec.h 00:03:01.096 TEST_HEADER include/spdk/pci_ids.h 00:03:01.096 TEST_HEADER include/spdk/pipe.h 00:03:01.096 TEST_HEADER include/spdk/queue.h 00:03:01.096 TEST_HEADER include/spdk/reduce.h 00:03:01.096 TEST_HEADER include/spdk/scheduler.h 00:03:01.096 TEST_HEADER include/spdk/rpc.h 00:03:01.096 TEST_HEADER include/spdk/scsi.h 00:03:01.096 TEST_HEADER include/spdk/scsi_spec.h 00:03:01.096 TEST_HEADER include/spdk/stdinc.h 00:03:01.096 TEST_HEADER include/spdk/sock.h 00:03:01.096 TEST_HEADER include/spdk/string.h 00:03:01.096 TEST_HEADER include/spdk/thread.h 00:03:01.096 TEST_HEADER include/spdk/trace.h 00:03:01.096 TEST_HEADER include/spdk/trace_parser.h 00:03:01.096 TEST_HEADER include/spdk/tree.h 00:03:01.096 TEST_HEADER include/spdk/ublk.h 
00:03:01.096 TEST_HEADER include/spdk/util.h 00:03:01.096 TEST_HEADER include/spdk/uuid.h 00:03:01.096 TEST_HEADER include/spdk/version.h 00:03:01.096 TEST_HEADER include/spdk/vfio_user_pci.h 00:03:01.096 TEST_HEADER include/spdk/vhost.h 00:03:01.096 TEST_HEADER include/spdk/vfio_user_spec.h 00:03:01.096 TEST_HEADER include/spdk/vmd.h 00:03:01.096 TEST_HEADER include/spdk/xor.h 00:03:01.096 TEST_HEADER include/spdk/zipf.h 00:03:01.096 CXX test/cpp_headers/accel.o 00:03:01.096 CXX test/cpp_headers/accel_module.o 00:03:01.096 CXX test/cpp_headers/assert.o 00:03:01.096 CXX test/cpp_headers/barrier.o 00:03:01.096 CXX test/cpp_headers/bdev.o 00:03:01.096 CXX test/cpp_headers/base64.o 00:03:01.096 CXX test/cpp_headers/bdev_module.o 00:03:01.096 CXX test/cpp_headers/bit_array.o 00:03:01.096 CXX test/cpp_headers/bdev_zone.o 00:03:01.096 CXX test/cpp_headers/bit_pool.o 00:03:01.096 CXX test/cpp_headers/blob_bdev.o 00:03:01.096 CXX test/cpp_headers/blobfs_bdev.o 00:03:01.096 CXX test/cpp_headers/blob.o 00:03:01.096 CXX test/cpp_headers/blobfs.o 00:03:01.096 CXX test/cpp_headers/conf.o 00:03:01.096 CXX test/cpp_headers/config.o 00:03:01.096 CXX test/cpp_headers/cpuset.o 00:03:01.096 CXX test/cpp_headers/crc16.o 00:03:01.096 CXX test/cpp_headers/crc32.o 00:03:01.096 CXX test/cpp_headers/crc64.o 00:03:01.096 CXX test/cpp_headers/dif.o 00:03:01.096 CXX test/cpp_headers/env_dpdk.o 00:03:01.096 CXX test/cpp_headers/dma.o 00:03:01.096 CXX test/cpp_headers/endian.o 00:03:01.096 CXX test/cpp_headers/env.o 00:03:01.096 CXX test/cpp_headers/event.o 00:03:01.096 CXX test/cpp_headers/fd.o 00:03:01.096 CXX test/cpp_headers/fd_group.o 00:03:01.096 CXX test/cpp_headers/file.o 00:03:01.096 CXX test/cpp_headers/ftl.o 00:03:01.096 CXX test/cpp_headers/gpt_spec.o 00:03:01.096 CXX test/cpp_headers/hexlify.o 00:03:01.096 CXX test/cpp_headers/histogram_data.o 00:03:01.096 CXX test/cpp_headers/idxd.o 00:03:01.096 CXX test/cpp_headers/init.o 00:03:01.096 CXX test/cpp_headers/idxd_spec.o 00:03:01.096 
CXX test/cpp_headers/ioat_spec.o 00:03:01.096 CXX test/cpp_headers/ioat.o 00:03:01.096 CXX test/cpp_headers/iscsi_spec.o 00:03:01.096 CXX test/cpp_headers/json.o 00:03:01.096 CXX test/cpp_headers/jsonrpc.o 00:03:01.096 CXX test/cpp_headers/keyring_module.o 00:03:01.096 CXX test/cpp_headers/keyring.o 00:03:01.096 CXX test/cpp_headers/likely.o 00:03:01.096 CXX test/cpp_headers/memory.o 00:03:01.096 CXX test/cpp_headers/lvol.o 00:03:01.096 CXX test/cpp_headers/mmio.o 00:03:01.096 CXX test/cpp_headers/log.o 00:03:01.096 CXX test/cpp_headers/notify.o 00:03:01.096 CXX test/cpp_headers/nvme.o 00:03:01.096 CXX test/cpp_headers/nbd.o 00:03:01.096 CXX test/cpp_headers/net.o 00:03:01.096 CXX test/cpp_headers/nvme_intel.o 00:03:01.096 CXX test/cpp_headers/nvme_spec.o 00:03:01.096 CXX test/cpp_headers/nvme_ocssd.o 00:03:01.096 CXX test/cpp_headers/nvme_zns.o 00:03:01.096 CXX test/cpp_headers/nvme_ocssd_spec.o 00:03:01.096 CXX test/cpp_headers/nvmf_cmd.o 00:03:01.096 CXX test/cpp_headers/nvmf_fc_spec.o 00:03:01.096 CXX test/cpp_headers/nvmf.o 00:03:01.096 CXX test/cpp_headers/nvmf_transport.o 00:03:01.096 CXX test/cpp_headers/nvmf_spec.o 00:03:01.096 CC examples/util/zipf/zipf.o 00:03:01.097 CXX test/cpp_headers/opal.o 00:03:01.097 CXX test/cpp_headers/opal_spec.o 00:03:01.097 CXX test/cpp_headers/pci_ids.o 00:03:01.097 CXX test/cpp_headers/pipe.o 00:03:01.097 CXX test/cpp_headers/reduce.o 00:03:01.097 CXX test/cpp_headers/queue.o 00:03:01.097 CXX test/cpp_headers/scheduler.o 00:03:01.097 LINK spdk_lspci 00:03:01.097 CXX test/cpp_headers/scsi.o 00:03:01.097 CXX test/cpp_headers/rpc.o 00:03:01.097 CXX test/cpp_headers/scsi_spec.o 00:03:01.097 CC test/thread/poller_perf/poller_perf.o 00:03:01.097 CC examples/ioat/verify/verify.o 00:03:01.097 CXX test/cpp_headers/sock.o 00:03:01.097 CXX test/cpp_headers/string.o 00:03:01.097 CC test/env/pci/pci_ut.o 00:03:01.097 CXX test/cpp_headers/stdinc.o 00:03:01.097 CXX test/cpp_headers/thread.o 00:03:01.097 CXX test/cpp_headers/trace.o 
00:03:01.097 CXX test/cpp_headers/trace_parser.o 00:03:01.097 CXX test/cpp_headers/tree.o 00:03:01.097 CXX test/cpp_headers/ublk.o 00:03:01.097 CXX test/cpp_headers/util.o 00:03:01.097 CC test/env/vtophys/vtophys.o 00:03:01.097 CXX test/cpp_headers/uuid.o 00:03:01.097 CXX test/cpp_headers/vfio_user_pci.o 00:03:01.097 CXX test/cpp_headers/version.o 00:03:01.097 CXX test/cpp_headers/vfio_user_spec.o 00:03:01.097 CC test/app/stub/stub.o 00:03:01.097 CXX test/cpp_headers/vhost.o 00:03:01.097 CC examples/ioat/perf/perf.o 00:03:01.097 CXX test/cpp_headers/xor.o 00:03:01.097 CXX test/cpp_headers/vmd.o 00:03:01.097 CXX test/cpp_headers/zipf.o 00:03:01.365 CC test/env/memory/memory_ut.o 00:03:01.365 CC test/app/jsoncat/jsoncat.o 00:03:01.365 CC app/fio/nvme/fio_plugin.o 00:03:01.365 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:03:01.365 CC test/app/histogram_perf/histogram_perf.o 00:03:01.365 CC test/app/bdev_svc/bdev_svc.o 00:03:01.365 CC app/fio/bdev/fio_plugin.o 00:03:01.365 LINK spdk_nvme_discover 00:03:01.365 LINK spdk_trace_record 00:03:01.365 CC test/dma/test_dma/test_dma.o 00:03:01.365 LINK rpc_client_test 00:03:01.631 LINK interrupt_tgt 00:03:01.631 LINK nvmf_tgt 00:03:01.631 LINK zipf 00:03:01.631 LINK iscsi_tgt 00:03:01.631 CC test/env/mem_callbacks/mem_callbacks.o 00:03:01.892 LINK spdk_tgt 00:03:01.892 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:03:01.892 LINK spdk_dd 00:03:01.892 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:03:01.892 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:03:01.892 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:03:02.151 LINK vtophys 00:03:02.151 LINK stub 00:03:02.151 LINK histogram_perf 00:03:02.151 LINK jsoncat 00:03:02.151 LINK env_dpdk_post_init 00:03:02.151 LINK poller_perf 00:03:02.151 LINK spdk_trace 00:03:02.151 LINK bdev_svc 00:03:02.411 LINK verify 00:03:02.411 LINK ioat_perf 00:03:02.411 LINK test_dma 00:03:02.411 CC examples/idxd/perf/perf.o 00:03:02.411 LINK pci_ut 00:03:02.411 CC examples/vmd/led/led.o 00:03:02.411 
CC examples/thread/thread/thread_ex.o 00:03:02.411 CC examples/vmd/lsvmd/lsvmd.o 00:03:02.670 CC examples/sock/hello_world/hello_sock.o 00:03:02.670 LINK vhost_fuzz 00:03:02.670 LINK spdk_bdev 00:03:02.670 LINK nvme_fuzz 00:03:02.670 LINK spdk_nvme 00:03:02.670 LINK spdk_nvme_perf 00:03:02.670 CC app/vhost/vhost.o 00:03:02.670 LINK spdk_top 00:03:02.670 LINK spdk_nvme_identify 00:03:02.670 LINK mem_callbacks 00:03:02.670 LINK lsvmd 00:03:02.670 CC test/event/event_perf/event_perf.o 00:03:02.670 CC test/event/reactor/reactor.o 00:03:02.670 CC test/event/reactor_perf/reactor_perf.o 00:03:02.670 CC test/event/app_repeat/app_repeat.o 00:03:02.930 CC test/event/scheduler/scheduler.o 00:03:02.930 LINK hello_sock 00:03:02.930 LINK idxd_perf 00:03:02.930 LINK thread 00:03:02.930 LINK vhost 00:03:02.930 LINK led 00:03:02.930 CC test/nvme/sgl/sgl.o 00:03:02.930 LINK event_perf 00:03:02.930 CC test/nvme/reset/reset.o 00:03:02.930 CC test/nvme/reserve/reserve.o 00:03:02.930 CC test/nvme/startup/startup.o 00:03:02.930 CC test/nvme/compliance/nvme_compliance.o 00:03:02.931 CC test/nvme/fdp/fdp.o 00:03:02.931 CC test/nvme/fused_ordering/fused_ordering.o 00:03:02.931 CC test/nvme/err_injection/err_injection.o 00:03:02.931 CC test/nvme/overhead/overhead.o 00:03:02.931 CC test/nvme/boot_partition/boot_partition.o 00:03:02.931 CC test/nvme/e2edp/nvme_dp.o 00:03:02.931 CC test/nvme/aer/aer.o 00:03:02.931 LINK reactor 00:03:02.931 CC test/nvme/connect_stress/connect_stress.o 00:03:02.931 CC test/nvme/doorbell_aers/doorbell_aers.o 00:03:02.931 CC test/nvme/cuse/cuse.o 00:03:02.931 CC test/nvme/simple_copy/simple_copy.o 00:03:02.931 LINK reactor_perf 00:03:02.931 CC test/accel/dif/dif.o 00:03:02.931 CC test/blobfs/mkfs/mkfs.o 00:03:02.931 LINK app_repeat 00:03:03.192 CC test/lvol/esnap/esnap.o 00:03:03.192 LINK scheduler 00:03:03.192 LINK startup 00:03:03.192 LINK memory_ut 00:03:03.192 LINK doorbell_aers 00:03:03.192 LINK err_injection 00:03:03.192 LINK connect_stress 00:03:03.192 LINK 
fused_ordering 00:03:03.192 LINK simple_copy 00:03:03.192 LINK sgl 00:03:03.192 LINK nvme_dp 00:03:03.192 LINK mkfs 00:03:03.192 LINK reset 00:03:03.192 LINK nvme_compliance 00:03:03.192 LINK overhead 00:03:03.192 LINK fdp 00:03:03.192 LINK aer 00:03:03.453 LINK boot_partition 00:03:03.453 CC examples/nvme/abort/abort.o 00:03:03.453 CC examples/nvme/cmb_copy/cmb_copy.o 00:03:03.453 CC examples/nvme/nvme_manage/nvme_manage.o 00:03:03.453 CC examples/nvme/hello_world/hello_world.o 00:03:03.453 CC examples/nvme/arbitration/arbitration.o 00:03:03.453 CC examples/nvme/hotplug/hotplug.o 00:03:03.453 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:03:03.453 CC examples/nvme/reconnect/reconnect.o 00:03:03.453 LINK reserve 00:03:03.453 LINK dif 00:03:03.453 CC examples/accel/perf/accel_perf.o 00:03:03.453 CC examples/blob/hello_world/hello_blob.o 00:03:03.453 CC examples/blob/cli/blobcli.o 00:03:03.713 LINK cmb_copy 00:03:03.713 LINK pmr_persistence 00:03:03.713 LINK hello_world 00:03:03.713 LINK hotplug 00:03:03.713 LINK arbitration 00:03:03.713 LINK abort 00:03:03.713 LINK reconnect 00:03:03.713 LINK hello_blob 00:03:03.713 LINK nvme_manage 00:03:03.973 LINK accel_perf 00:03:03.973 LINK blobcli 00:03:03.973 LINK iscsi_fuzz 00:03:03.973 CC test/bdev/bdevio/bdevio.o 00:03:03.973 LINK cuse 00:03:04.544 LINK bdevio 00:03:04.544 CC examples/bdev/hello_world/hello_bdev.o 00:03:04.544 CC examples/bdev/bdevperf/bdevperf.o 00:03:04.805 LINK hello_bdev 00:03:05.375 LINK bdevperf 00:03:05.947 CC examples/nvmf/nvmf/nvmf.o 00:03:06.209 LINK nvmf 00:03:07.152 LINK esnap 00:03:07.413 00:03:07.413 real 1m19.924s 00:03:07.413 user 13m55.475s 00:03:07.413 sys 6m51.117s 00:03:07.413 13:11:48 make -- common/autotest_common.sh@1126 -- $ xtrace_disable 00:03:07.413 13:11:48 make -- common/autotest_common.sh@10 -- $ set +x 00:03:07.413 ************************************ 00:03:07.413 END TEST make 00:03:07.413 ************************************ 00:03:07.673 13:11:48 -- 
spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:03:07.673 13:11:48 -- pm/common@29 -- $ signal_monitor_resources TERM 00:03:07.673 13:11:48 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:03:07.673 13:11:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:07.673 13:11:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:03:07.673 13:11:48 -- pm/common@44 -- $ pid=687491 00:03:07.673 13:11:48 -- pm/common@50 -- $ kill -TERM 687491 00:03:07.673 13:11:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:07.673 13:11:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:03:07.673 13:11:48 -- pm/common@44 -- $ pid=687492 00:03:07.673 13:11:48 -- pm/common@50 -- $ kill -TERM 687492 00:03:07.673 13:11:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:07.673 13:11:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:03:07.673 13:11:48 -- pm/common@44 -- $ pid=687494 00:03:07.673 13:11:48 -- pm/common@50 -- $ kill -TERM 687494 00:03:07.673 13:11:48 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:03:07.673 13:11:48 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:03:07.673 13:11:48 -- pm/common@44 -- $ pid=687517 00:03:07.673 13:11:48 -- pm/common@50 -- $ sudo -E kill -TERM 687517 00:03:07.673 13:11:48 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:03:07.673 13:11:48 -- nvmf/common.sh@7 -- # uname -s 00:03:07.673 13:11:48 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:03:07.673 13:11:48 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:03:07.673 13:11:48 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:03:07.673 13:11:48 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 
00:03:07.673 13:11:48 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:03:07.673 13:11:48 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:03:07.673 13:11:48 -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:03:07.673 13:11:48 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:03:07.673 13:11:48 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:03:07.673 13:11:48 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:03:07.673 13:11:48 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:03:07.673 13:11:48 -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:03:07.673 13:11:48 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:03:07.673 13:11:48 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:03:07.673 13:11:48 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:03:07.673 13:11:48 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:03:07.674 13:11:48 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:03:07.674 13:11:48 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:03:07.674 13:11:48 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:03:07.674 13:11:48 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:03:07.674 13:11:48 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:07.674 13:11:48 -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:07.674 13:11:48 -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:07.674 13:11:48 -- paths/export.sh@5 -- # export PATH 00:03:07.674 13:11:48 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:03:07.674 13:11:48 -- nvmf/common.sh@47 -- # : 0 00:03:07.674 13:11:48 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:03:07.674 13:11:48 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:03:07.674 13:11:48 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:03:07.674 13:11:48 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:03:07.674 13:11:48 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:03:07.674 13:11:48 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:03:07.674 13:11:48 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:03:07.674 13:11:48 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:03:07.674 13:11:48 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:03:07.674 13:11:48 -- spdk/autotest.sh@32 -- # uname -s 00:03:07.674 13:11:48 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:03:07.674 13:11:48 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:03:07.674 13:11:48 -- spdk/autotest.sh@34 -- # mkdir -p 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:07.674 13:11:48 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:03:07.674 13:11:48 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:03:07.674 13:11:48 -- spdk/autotest.sh@44 -- # modprobe nbd 00:03:07.674 13:11:48 -- spdk/autotest.sh@46 -- # type -P udevadm 00:03:07.674 13:11:48 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:03:07.674 13:11:48 -- spdk/autotest.sh@48 -- # udevadm_pid=756090 00:03:07.674 13:11:48 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:03:07.674 13:11:48 -- pm/common@17 -- # local monitor 00:03:07.674 13:11:48 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:07.674 13:11:48 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:03:07.674 13:11:48 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:07.674 13:11:48 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:07.674 13:11:48 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:03:07.674 13:11:48 -- pm/common@21 -- # date +%s 00:03:07.674 13:11:48 -- pm/common@25 -- # sleep 1 00:03:07.674 13:11:48 -- pm/common@21 -- # date +%s 00:03:07.674 13:11:48 -- pm/common@21 -- # date +%s 00:03:07.674 13:11:48 -- pm/common@21 -- # date +%s 00:03:07.674 13:11:48 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721905908 00:03:07.674 13:11:48 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721905908 00:03:07.674 13:11:48 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721905908 00:03:07.674 13:11:48 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1721905908 00:03:07.934 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721905908_collect-vmstat.pm.log 00:03:07.934 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721905908_collect-cpu-temp.pm.log 00:03:07.934 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721905908_collect-cpu-load.pm.log 00:03:07.934 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1721905908_collect-bmc-pm.bmc.pm.log 00:03:08.874 13:11:49 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:03:08.874 13:11:49 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:03:08.874 13:11:49 -- common/autotest_common.sh@724 -- # xtrace_disable 00:03:08.874 13:11:49 -- common/autotest_common.sh@10 -- # set +x 00:03:08.874 13:11:49 -- spdk/autotest.sh@59 -- # create_test_list 00:03:08.874 13:11:49 -- common/autotest_common.sh@748 -- # xtrace_disable 00:03:08.874 13:11:49 -- common/autotest_common.sh@10 -- # set +x 00:03:08.874 13:11:49 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:03:08.874 13:11:49 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:08.874 13:11:49 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:03:08.874 13:11:49 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:03:08.874 13:11:49 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 
00:03:08.874 13:11:49 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:03:08.874 13:11:49 -- common/autotest_common.sh@1455 -- # uname 00:03:08.874 13:11:49 -- common/autotest_common.sh@1455 -- # '[' Linux = FreeBSD ']' 00:03:08.874 13:11:49 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 00:03:08.874 13:11:49 -- common/autotest_common.sh@1475 -- # uname 00:03:08.874 13:11:49 -- common/autotest_common.sh@1475 -- # [[ Linux = FreeBSD ]] 00:03:08.874 13:11:49 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:03:08.874 13:11:49 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:03:08.874 13:11:49 -- spdk/autotest.sh@72 -- # hash lcov 00:03:08.874 13:11:49 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:03:08.874 13:11:49 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:03:08.874 --rc lcov_branch_coverage=1 00:03:08.874 --rc lcov_function_coverage=1 00:03:08.874 --rc genhtml_branch_coverage=1 00:03:08.874 --rc genhtml_function_coverage=1 00:03:08.874 --rc genhtml_legend=1 00:03:08.874 --rc geninfo_all_blocks=1 00:03:08.874 ' 00:03:08.874 13:11:49 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:03:08.874 --rc lcov_branch_coverage=1 00:03:08.874 --rc lcov_function_coverage=1 00:03:08.874 --rc genhtml_branch_coverage=1 00:03:08.874 --rc genhtml_function_coverage=1 00:03:08.874 --rc genhtml_legend=1 00:03:08.874 --rc geninfo_all_blocks=1 00:03:08.874 ' 00:03:08.874 13:11:49 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:03:08.874 --rc lcov_branch_coverage=1 00:03:08.874 --rc lcov_function_coverage=1 00:03:08.874 --rc genhtml_branch_coverage=1 00:03:08.874 --rc genhtml_function_coverage=1 00:03:08.874 --rc genhtml_legend=1 00:03:08.874 --rc geninfo_all_blocks=1 00:03:08.874 --no-external' 00:03:08.874 13:11:49 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:03:08.874 --rc lcov_branch_coverage=1 00:03:08.874 --rc lcov_function_coverage=1 00:03:08.874 --rc genhtml_branch_coverage=1 00:03:08.874 --rc genhtml_function_coverage=1 00:03:08.874 --rc 
genhtml_legend=1 00:03:08.874 --rc geninfo_all_blocks=1 00:03:08.874 --no-external' 00:03:08.874 13:11:49 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -v 00:03:08.874 lcov: LCOV version 1.14 00:03:08.874 13:11:49 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:03:47.661 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:03:47.661 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:04:02.603 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:04:02.603 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:04:02.603 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:04:02.604 
geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:04:02.604 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions 
found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:04:02.604 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 
00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/net.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:04:02.604 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:04:02.604 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:04:02.605 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:04:02.605 geninfo: WARNING: 
GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no 
functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:04:02.605 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:04:02.605 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:04:03.991 13:12:44 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:04:03.991 13:12:44 -- common/autotest_common.sh@724 -- # xtrace_disable 00:04:03.991 13:12:44 -- common/autotest_common.sh@10 -- # set +x 00:04:03.991 13:12:44 -- spdk/autotest.sh@91 -- # rm -f 00:04:03.991 13:12:44 -- spdk/autotest.sh@94 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:04:08.197 0000:80:01.6 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:80:01.7 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:80:01.4 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:80:01.5 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:80:01.2 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:80:01.3 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:80:01.0 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:80:01.1 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:65:00.0 (8086 0a54): Already using the nvme driver
00:04:08.197 0000:00:01.6 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:00:01.7 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:00:01.4 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:00:01.5 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:00:01.2 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:00:01.3 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:00:01.0 (8086 0b00): Already using the ioatdma driver
00:04:08.197 0000:00:01.1 (8086 0b00): Already using the ioatdma driver
00:04:08.197 13:12:48 -- spdk/autotest.sh@96 -- # get_zoned_devs
00:04:08.197 13:12:48 -- common/autotest_common.sh@1669 -- # zoned_devs=()
00:04:08.197 13:12:48 -- common/autotest_common.sh@1669 -- # local -gA zoned_devs
00:04:08.197 13:12:48 -- common/autotest_common.sh@1670 -- # local nvme bdf
00:04:08.197 13:12:48 -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme*
00:04:08.197 13:12:48 -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1
00:04:08.197 13:12:48 -- common/autotest_common.sh@1662 -- # local device=nvme0n1
00:04:08.197 13:12:48 -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:04:08.197 13:12:48 -- common/autotest_common.sh@1665 -- # [[ none != none ]]
00:04:08.197 13:12:48 -- spdk/autotest.sh@98 -- # (( 0 > 0 ))
00:04:08.197 13:12:48 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*)
00:04:08.197 13:12:48 -- spdk/autotest.sh@112 -- # [[ -z '' ]]
00:04:08.197 13:12:48 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1
00:04:08.197 13:12:48 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt
00:04:08.197 13:12:48 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1
00:04:08.197 No valid GPT data, bailing
00:04:08.197 13:12:48 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1
00:04:08.197 13:12:48 -- scripts/common.sh@391 -- # pt=
00:04:08.197 13:12:48 -- scripts/common.sh@392 -- # return 1
00:04:08.197 13:12:48 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1
00:04:08.197 1+0 records in
00:04:08.197 1+0 records out
00:04:08.197 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00552723 s, 190 MB/s
00:04:08.197 13:12:48 -- spdk/autotest.sh@118 -- # sync
00:04:08.197 13:12:48 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes
00:04:08.197 13:12:48 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null'
00:04:08.197 13:12:48 -- common/autotest_common.sh@22 -- # reap_spdk_processes
00:04:16.335 13:12:56 -- spdk/autotest.sh@124 -- # uname -s
00:04:16.335 13:12:56 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']'
00:04:16.335 13:12:56 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh
00:04:16.335 13:12:56 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:04:16.335 13:12:56 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:04:16.335 13:12:56 -- common/autotest_common.sh@10 -- # set +x
00:04:16.335 ************************************
00:04:16.335 START TEST setup.sh
00:04:16.335 ************************************
00:04:16.335 13:12:56 setup.sh -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh
00:04:16.335 * Looking for test storage...
00:04:16.335 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:04:16.335 13:12:56 setup.sh -- setup/test-setup.sh@10 -- # uname -s
00:04:16.335 13:12:56 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]]
00:04:16.335 13:12:56 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh
00:04:16.335 13:12:56 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:04:16.335 13:12:56 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable
00:04:16.335 13:12:56 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:04:16.335 ************************************
00:04:16.335 START TEST acl
00:04:16.335 ************************************
00:04:16.335 13:12:56 setup.sh.acl -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh
00:04:16.335 * Looking for test storage...
00:04:16.336 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:04:16.336 13:12:56 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs
00:04:16.336 13:12:56 setup.sh.acl -- common/autotest_common.sh@1669 -- # zoned_devs=()
00:04:16.336 13:12:56 setup.sh.acl -- common/autotest_common.sh@1669 -- # local -gA zoned_devs
00:04:16.336 13:12:56 setup.sh.acl -- common/autotest_common.sh@1670 -- # local nvme bdf
00:04:16.336 13:12:56 setup.sh.acl -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme*
00:04:16.336 13:12:56 setup.sh.acl -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1
00:04:16.336 13:12:56 setup.sh.acl -- common/autotest_common.sh@1662 -- # local device=nvme0n1
00:04:16.336 13:12:56 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]]
00:04:16.336 13:12:56 setup.sh.acl -- common/autotest_common.sh@1665 -- # [[ none != none ]]
00:04:16.336 13:12:56 setup.sh.acl -- setup/acl.sh@12 -- # devs=()
00:04:16.336 13:12:56 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs
00:04:16.336 13:12:56 setup.sh.acl -- setup/acl.sh@13 -- # drivers=()
00:04:16.336 13:12:56 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers
00:04:16.336 13:12:56 setup.sh.acl -- setup/acl.sh@51 -- # setup reset
00:04:16.336 13:12:56 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:16.336 13:12:56 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:04:20.532 13:13:00 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs
00:04:20.532 13:13:00 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver
00:04:20.532 13:13:00 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:20.532 13:13:00 setup.sh.acl -- setup/acl.sh@15 -- # setup output status
00:04:20.532 13:13:00 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]]
00:04:20.532 13:13:00 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:04:23.825 Hugepages
00:04:23.825 node hugesize free / total
00:04:23.825 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]]
00:04:23.825 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:04:23.825 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:23.825 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]]
00:04:23.825 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:04:23.825 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 
00:04:24.086 Type BDF Vendor Device NUMA Driver Device Block devices
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.0 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.1 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.2 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.3 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.4 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.5 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.6 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:01.7 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:65:00.0 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@21 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev")
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.0 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.1 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.2 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.3 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.4 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.5 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.6 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:01.7 == *:*:*.* ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]]
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@20 -- # continue
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 ))
00:04:24.086 13:13:04 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied
00:04:24.086 13:13:04 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:04:24.086 13:13:04 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable
00:04:24.086 13:13:04 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x
00:04:24.086 ************************************
00:04:24.086 START TEST denied
00:04:24.086 ************************************
00:04:24.086 13:13:04 setup.sh.acl.denied -- common/autotest_common.sh@1125 -- # denied
00:04:24.086 13:13:04 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED=' 0000:65:00.0'
00:04:24.086 13:13:04 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config
00:04:24.086 13:13:04 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:65:00.0'
00:04:24.086 13:13:04 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]]
00:04:24.086 13:13:04 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:04:28.289 0000:65:00.0 (8086 0a54): Skipping denied controller at 0000:65:00.0
00:04:28.289 13:13:08 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:65:00.0
00:04:28.289 13:13:08 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver
00:04:28.289 13:13:08 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@"
00:04:28.289 13:13:08 setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:65:00.0 ]]
00:04:28.289 13:13:08 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:65:00.0/driver
00:04:28.289 13:13:08 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme
00:04:28.289 13:13:08 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]]
00:04:28.289 13:13:08 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset
00:04:28.289 13:13:08 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:28.289 13:13:08 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:04:33.587 
00:04:33.587 real 0m9.343s
00:04:33.587 user 0m3.086s
00:04:33.587 sys 0m5.518s
00:04:33.587 13:13:14 setup.sh.acl.denied -- common/autotest_common.sh@1126 -- # xtrace_disable
00:04:33.587 13:13:14 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x
00:04:33.587 ************************************
00:04:33.587 END TEST denied
00:04:33.587 ************************************
00:04:33.587 13:13:14 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed
00:04:33.587 13:13:14 setup.sh.acl -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:04:33.587 13:13:14 setup.sh.acl -- common/autotest_common.sh@1107 -- # xtrace_disable
00:04:33.587 13:13:14 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x
00:04:33.587 ************************************
00:04:33.587 START TEST allowed
00:04:33.587 ************************************
00:04:33.587 13:13:14 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # allowed
00:04:33.587 13:13:14 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:65:00.0
00:04:33.587 13:13:14 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config
00:04:33.587 13:13:14 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:65:00.0 .*: nvme -> .*'
00:04:33.587 13:13:14 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ output == output ]]
00:04:33.587 13:13:14 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config
00:04:40.233 0000:65:00.0 (8086 0a54): nvme -> vfio-pci
00:04:40.233 13:13:20 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify
00:04:40.233 13:13:20 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver
00:04:40.233 13:13:20 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset
00:04:40.233 13:13:20 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]]
00:04:40.233 13:13:20 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset
00:04:44.435 
00:04:44.435 real 0m10.322s
00:04:44.435 user 0m3.134s
00:04:44.435 sys 0m5.427s
00:04:44.435 13:13:24 setup.sh.acl.allowed -- common/autotest_common.sh@1126 -- # xtrace_disable
00:04:44.435 13:13:24 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x
00:04:44.435 ************************************
00:04:44.435 END TEST allowed
00:04:44.435 ************************************
00:04:44.435 
00:04:44.435 real 0m28.443s
00:04:44.435 user 0m9.461s
00:04:44.435 sys 0m16.711s
00:04:44.435 13:13:24 setup.sh.acl -- common/autotest_common.sh@1126 -- # xtrace_disable
00:04:44.435 13:13:24 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x
00:04:44.435 ************************************
00:04:44.435 END TEST acl
00:04:44.435 ************************************
00:04:44.435 13:13:24 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh
00:04:44.435 13:13:24 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:04:44.435 13:13:24 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable
00:04:44.435 13:13:24 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:04:44.435 ************************************
00:04:44.435 START TEST hugepages
00:04:44.435 ************************************
00:04:44.435 13:13:24 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh
00:04:44.435 * Looking for test storage...
00:04:44.435 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=()
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@18 -- # local node=
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@19 -- # local var val
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.435 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 108044036 kB' 'MemAvailable: 111264284 kB' 'Buffers: 2704 kB' 'Cached: 9766244 kB' 'SwapCached: 0 kB' 'Active: 6825516 kB' 'Inactive: 3510808 kB' 'Active(anon): 6425752 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 570868 kB' 'Mapped: 219200 kB' 'Shmem: 5858376 kB' 'KReclaimable: 268036 kB' 'Slab: 948408 kB' 'SReclaimable: 268036 kB' 'SUnreclaim: 680372 kB' 'KernelStack: 25104 kB' 'PageTables: 8856 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69463468 kB' 'Committed_AS: 7957588 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229568 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB'
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.436 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': '
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]]
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # continue
00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': 
' 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@16 -- # default_hugepages=2048 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@37 -- # 
local node hp 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:04:44.437 13:13:24 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:04:44.437 13:13:24 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:44.437 13:13:24 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:44.437 13:13:24 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:44.437 ************************************ 00:04:44.437 START TEST default_setup 00:04:44.437 ************************************ 00:04:44.437 13:13:24 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1125 -- # default_setup 00:04:44.437 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # 
get_test_nr_hugepages 2097152 0 00:04:44.437 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:04:44.437 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup 
output 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:04:44.438 13:13:24 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:48.643 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:04:48.643 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:04:50.030 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:04:50.030 13:13:30 
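[Editor's note] The long `IFS=': '` / `read -r var val _` runs in this trace come from the `get_meminfo` helper in setup/common.sh, which scans /proc/meminfo (or a per-node meminfo file) line by line until the requested key matches. A minimal sketch, reconstructed from the trace; the exact helper in setup/common.sh also strips a `Node N ` prefix via mapfile, which is omitted here:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo parsing loop visible in the xtrace above.
# Function/file names follow the log; details are assumptions.
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=/proc/meminfo
    # With a node argument, read that node's meminfo instead (as in common.sh@23)
    [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]] &&
        mem_f=/sys/devices/system/node/node$node/meminfo
    local var val _
    while IFS=': ' read -r var val _; do
        # Each non-matching key produces one "continue" line in the trace
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$mem_f"
    return 1
}

get_meminfo MemTotal   # prints total memory in kB, e.g. 126344032 in this log
```

The `Hugepagesize` lookup seen above returns 2048, which becomes `default_hugepages=2048`.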
setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110198904 kB' 'MemAvailable: 113419152 kB' 'Buffers: 2704 kB' 'Cached: 9766392 kB' 'SwapCached: 0 kB' 'Active: 6840908 kB' 'Inactive: 3510808 kB' 'Active(anon): 6441144 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585668 kB' 'Mapped: 218456 kB' 'Shmem: 5858524 kB' 'KReclaimable: 268036 kB' 'Slab: 946676 kB' 'SReclaimable: 
268036 kB' 'SUnreclaim: 678640 kB' 'KernelStack: 24992 kB' 'PageTables: 8516 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7983420 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229580 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 13:13:30 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.030 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- 
# read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 
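[Editor's note] Earlier in the trace (hugepages.sh@39-45), `clear_hp` walks every NUMA node's hugepage directories and writes 0 to each `nr_hugepages` before the test runs, then exports `CLEAR_HUGE=yes`. A sketch under the standard sysfs layout; writing these files requires root, which the CI job has via sudo:

```shell
#!/usr/bin/env bash
# Sketch of the clear_hp loop from setup/hugepages.sh as it appears in
# the xtrace; sysfs paths are the standard layout, not copied from source.
clear_hp() {
    local node hp
    for node in /sys/devices/system/node/node[0-9]*; do
        # One "echo 0" per page size per node, matching the trace
        for hp in "$node"/hugepages/hugepages-*; do
            echo 0 > "$hp/nr_hugepages"
        done
    done
    export CLEAR_HUGE=yes
}
```

This is why both nodes show freshly allocated pools (HugePages_Total: 1024, HugePages_Free: 1024) in the meminfo snapshots that follow.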
00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
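[Editor's note] The backslash runs like `\A\n\o\n\H\u\g\e\P\a\g\e\s` throughout this trace are not corruption: bash xtrace prints a quoted (non-glob) right-hand side of `[[ ... == ... ]]` with every character escaped, so the comparison is shown as a literal match. A two-line demonstration:

```shell
#!/usr/bin/env bash
# Under set -x, a quoted RHS is echoed with per-character escapes,
# exactly as seen in this log's [[ $var == \A\n\o\n... ]] lines.
set -x
var=AnonHugePages
[[ $var == "AnonHugePages" ]] && echo match
set +x
```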
00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.297 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var 
val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:50.298 13:13:30 
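[Editor's note] The `nr_hugepages=1024` chosen earlier (hugepages.sh@57) follows directly from the traced values: `get_test_nr_hugepages 2097152 0` requests 2097152 kB (2 GiB, assuming the size argument is in kB as the log's Hugetlb total suggests) against the 2048 kB default page size:

```shell
#!/usr/bin/env bash
# The arithmetic behind nr_hugepages=1024 in the trace:
# a 2097152 kB request divided by the 2048 kB default hugepage size.
size_kb=2097152
default_hugepages_kb=2048
nr_hugepages=$(( size_kb / default_hugepages_kb ))
echo "$nr_hugepages"
```

The result matches the `Hugetlb: 2097152 kB` and `HugePages_Total: 1024` lines in the meminfo snapshots.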
setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110199584 kB' 'MemAvailable: 113419832 kB' 'Buffers: 2704 kB' 'Cached: 9766392 kB' 'SwapCached: 0 kB' 'Active: 6839892 kB' 'Inactive: 3510808 kB' 'Active(anon): 6440128 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585208 kB' 'Mapped: 218360 kB' 'Shmem: 5858524 kB' 'KReclaimable: 268036 kB' 'Slab: 946672 kB' 'SReclaimable: 268036 kB' 'SUnreclaim: 678636 kB' 'KernelStack: 24992 kB' 'PageTables: 8520 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7983436 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229564 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB'
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:50.298 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
[... identical IFS=': ' / read -r var val _ / [[ <field> == HugePages_Surp ]] / continue iterations for each remaining /proc/meminfo field ...]
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110199956 kB' 'MemAvailable: 113420204 kB' 'Buffers: 2704 kB' 'Cached: 9766428 kB' 'SwapCached: 0 kB' 'Active: 6839452 kB' 'Inactive: 3510808 kB' 'Active(anon): 6439688 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 584652 kB' 'Mapped: 218360 kB' 'Shmem: 5858560 kB' 'KReclaimable: 268036 kB' 'Slab: 946672 kB' 'SReclaimable: 268036 kB' 'SUnreclaim: 678636 kB' 'KernelStack: 24976 kB' 'PageTables: 8460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7983460 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229564 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB'
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]]
00:04:50.300 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue
[... identical IFS=': ' / read -r var val _ / [[ <field> == HugePages_Rsvd ]] / continue iterations for each remaining /proc/meminfo field; trace truncated ...]
setup/common.sh@31 -- # read -r var val _ 00:04:50.301 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.301 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.301 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.301 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.301 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.301 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.301 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.301 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- 
# IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 
00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:50.302 nr_hugepages=1024 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:50.302 resv_hugepages=0 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:50.302 
surplus_hugepages=0 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:50.302 anon_hugepages=0 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:50.302 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110199452 kB' 'MemAvailable: 113419700 kB' 'Buffers: 2704 kB' 'Cached: 9766452 kB' 'SwapCached: 0 kB' 'Active: 6839364 kB' 'Inactive: 3510808 kB' 'Active(anon): 6439600 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 
'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 584548 kB' 'Mapped: 218360 kB' 'Shmem: 5858584 kB' 'KReclaimable: 268036 kB' 'Slab: 946672 kB' 'SReclaimable: 268036 kB' 'SUnreclaim: 678636 kB' 'KernelStack: 24976 kB' 'PageTables: 8460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7983480 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229580 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.303 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.303 13:13:30 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.303 13:13:30 [trace condensed: setup/common.sh@31-32 repeat the identical IFS=': ' / read -r var val _ / [[ <key> == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] / continue sequence for every /proc/meminfo key from Buffers through FilePmdMapped, none of which match] 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- #
continue 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 
-- # for node in /sys/devices/system/node/node+([0-9]) 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:50.304 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 
'MemFree: 59472372 kB' 'MemUsed: 6189628 kB' 'SwapCached: 0 kB' 'Active: 1851272 kB' 'Inactive: 166524 kB' 'Active(anon): 1670376 kB' 'Inactive(anon): 0 kB' 'Active(file): 180896 kB' 'Inactive(file): 166524 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1632252 kB' 'Mapped: 61012 kB' 'AnonPages: 388740 kB' 'Shmem: 1284832 kB' 'KernelStack: 12504 kB' 'PageTables: 5124 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 140448 kB' 'Slab: 475192 kB' 'SReclaimable: 140448 kB' 'SUnreclaim: 334744 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # 
read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 
13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.305 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:04:50.306 13:13:30 
setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:04:50.306 node0=1024 expecting 1024 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:04:50.306 00:04:50.306 real 0m6.057s 00:04:50.306 user 0m1.755s 00:04:50.306 sys 0m2.594s 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:50.306 13:13:30 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:04:50.306 ************************************ 00:04:50.306 END TEST default_setup 00:04:50.306 ************************************ 00:04:50.306 13:13:31 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:04:50.306 13:13:31 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:50.306 13:13:31 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:50.306 13:13:31 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:50.306 ************************************ 00:04:50.306 START TEST per_node_1G_alloc 00:04:50.306 ************************************ 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # per_node_1G_alloc 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:04:50.306 
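The `default_setup` trace that just finished (the long run of `IFS=': '` / `read -r var val _` / `continue` lines) is `setup/common.sh`'s `get_meminfo` walking a meminfo file key by key until it hits the requested field, then echoing its value. A hypothetical, simplified sketch of that parsing pattern, reading from stdin so it works on any input (the real helper picks `/proc/meminfo` or `/sys/devices/system/node/node<N>/meminfo` and strips the `Node <N> ` prefix first):

```shell
#!/usr/bin/env bash
# Simplified sketch of the get_meminfo pattern seen in the trace above
# (NOT the actual setup/common.sh): split each line on ':' and spaces,
# skip non-matching keys, and print the value of the requested one.
get_meminfo() {
  local get=$1 var val _
  while IFS=': ' read -r var val _; do
    if [[ $var == "$get" ]]; then
      # $val is the bare number; any "kB" unit suffix lands in $_
      echo "$val"
      return 0
    fi
  done
  return 1
}

# Portable demo on a canned snippet; on a Linux box you would redirect
# from /proc/meminfo (or a per-node meminfo under /sys) instead.
printf 'MemTotal: 65662000 kB\nHugePages_Total: 1024\n' |
  get_meminfo HugePages_Total   # prints 1024
```

This mirrors why the trace above ends with `echo 1024` / `return 0` once the loop reaches the `HugePages_Total` line, and with `echo 0` for the per-node `HugePages_Surp` lookup.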
13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:50.306 13:13:31 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:50.306 13:13:31 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:54.514 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:54.514 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 
0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.514 13:13:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110208952 kB' 'MemAvailable: 113429184 kB' 'Buffers: 2704 kB' 'Cached: 9766560 kB' 'SwapCached: 0 kB' 'Active: 6836220 kB' 'Inactive: 3510808 kB' 'Active(anon): 6436456 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 581004 kB' 'Mapped: 217400 kB' 'Shmem: 5858692 kB' 'KReclaimable: 268004 kB' 'Slab: 946044 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 678040 kB' 'KernelStack: 25152 kB' 'PageTables: 8788 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7960972 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229788 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.514 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 
13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.515 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.516 13:13:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110209588 kB' 'MemAvailable: 113429820 kB' 'Buffers: 2704 kB' 'Cached: 9766564 kB' 'SwapCached: 0 kB' 'Active: 6836008 kB' 'Inactive: 3510808 kB' 'Active(anon): 6436244 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580832 kB' 'Mapped: 217400 kB' 'Shmem: 5858696 kB' 'KReclaimable: 268004 kB' 'Slab: 946040 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 678036 kB' 'KernelStack: 25120 kB' 'PageTables: 8992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7959376 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229820 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 
kB' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.516 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.517 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val 
_ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110208624 kB' 'MemAvailable: 113428856 kB' 'Buffers: 2704 kB' 'Cached: 9766584 kB' 'SwapCached: 0 kB' 'Active: 6835832 kB' 'Inactive: 3510808 kB' 
'Active(anon): 6436068 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580600 kB' 'Mapped: 217372 kB' 'Shmem: 5858716 kB' 'KReclaimable: 268004 kB' 'Slab: 946048 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 678044 kB' 'KernelStack: 24944 kB' 'PageTables: 8676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7961012 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229804 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.518 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 
13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.519 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.520 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:54.521 nr_hugepages=1024 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:54.521 resv_hugepages=0 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:54.521 surplus_hugepages=0 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:54.521 anon_hugepages=0 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 
-- # get_meminfo HugePages_Total 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110210608 kB' 'MemAvailable: 113430840 kB' 'Buffers: 2704 kB' 'Cached: 9766608 kB' 'SwapCached: 0 kB' 'Active: 6836040 kB' 'Inactive: 3510808 kB' 'Active(anon): 6436276 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580760 kB' 'Mapped: 217372 kB' 'Shmem: 5858740 kB' 'KReclaimable: 268004 kB' 'Slab: 946048 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 678044 kB' 'KernelStack: 25120 kB' 'PageTables: 8608 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 
'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7961036 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229788 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB'
00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:54.521 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
[... identical "[[ <field> == HugePages_Total ]] / continue / IFS=': ' / read -r var val _" trace entries repeated for each remaining /proc/meminfo field (MemFree through Unaccepted); elided here ...]
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]]
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv ))
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node
00:04:54.523 13:13:35
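[log annotation] The get_meminfo loop traced above (setup/common.sh @31–@33) scans a meminfo-style file field by field and prints the value of the requested field. A minimal standalone sketch of that pattern, under assumptions: the helper name `get_meminfo_field` and its optional file argument are hypothetical, chosen for illustration; the real function in setup/common.sh is `get_meminfo` and selects the file from a node number.

```shell
#!/usr/bin/env bash
shopt -s extglob   # enables the +([0-9]) pattern used to strip "Node <N> " prefixes

# Hypothetical restatement of the pattern traced in the log: read a
# meminfo-style file into an array, drop any per-node "Node <N> " prefix
# (per-node files under /sys/devices/system/node/node<N>/meminfo carry one),
# then scan for the requested field and print its value.
get_meminfo_field() {
  local get=$1 mem_f=${2:-/proc/meminfo}
  local -a mem
  mapfile -t mem < "$mem_f"
  mem=("${mem[@]#Node +([0-9]) }")   # same strip as common.sh@29 in the trace
  local var val _
  while IFS=': ' read -r var val _; do
    # Matching field found: print its numeric value and stop, like
    # the "echo 1024 / return 0" seen at common.sh@33 in the trace.
    [[ $var == "$get" ]] && { echo "$val"; return 0; }
  done < <(printf '%s\n' "${mem[@]}")
  return 1
}
```

`IFS=': '` treats both the colon and spaces as separators, so for a line like `HugePages_Total: 1024` the loop sees `var=HugePages_Total` and `val=1024`, with any trailing `kB` falling into the throwaway `_` variable.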
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9])
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 ))
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}"
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv ))
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]]
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 60550108 kB' 'MemUsed: 5111892 kB' 'SwapCached: 0 kB' 'Active: 1847816 kB' 'Inactive: 166524 kB' 'Active(anon): 1666920 kB' 'Inactive(anon): 0 kB' 'Active(file): 180896 kB' 'Inactive(file): 166524 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1632308 kB' 'Mapped: 60440 kB' 'AnonPages: 385080 kB' 'Shmem: 1284888 kB' 'KernelStack: 12408 kB' 'PageTables: 4696 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 140448 kB' 'Slab: 474796 kB' 'SReclaimable: 140448 kB' 'SUnreclaim: 334348 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0'
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:54.523 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _
[... identical "[[ <field> == HugePages_Surp ]] / continue / IFS=': ' / read -r var val _" trace entries repeated for each remaining node0 meminfo field; elided here, trace continues ...]
HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:04:54.524 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 
00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682032 kB' 'MemFree: 49661704 kB' 'MemUsed: 11020328 kB' 'SwapCached: 0 kB' 'Active: 4987684 kB' 'Inactive: 3344284 kB' 'Active(anon): 4768816 kB' 'Inactive(anon): 0 kB' 'Active(file): 218868 kB' 'Inactive(file): 3344284 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8137024 kB' 'Mapped: 156932 kB' 'AnonPages: 195104 kB' 'Shmem: 4573872 kB' 'KernelStack: 12488 kB' 'PageTables: 3800 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127556 kB' 'Slab: 471252 kB' 'SReclaimable: 127556 kB' 'SUnreclaim: 343696 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.525 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue [... identical IFS/read/compare/continue xtrace repeated for each remaining /proc/meminfo field ...] 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # 
sorted_s[nodes_sys[node]]=1 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:54.526 node0=512 expecting 512 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:54.526 node1=512 expecting 512 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:54.526 00:04:54.526 real 0m4.157s 00:04:54.526 user 0m1.678s 00:04:54.526 sys 0m2.555s 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:54.526 13:13:35 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:54.526 ************************************ 00:04:54.526 END TEST per_node_1G_alloc 00:04:54.526 ************************************ 00:04:54.526 13:13:35 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:04:54.526 13:13:35 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:54.526 13:13:35 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:54.526 13:13:35 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:54.786 ************************************ 00:04:54.786 START TEST even_2G_alloc 00:04:54.786 ************************************ 00:04:54.786 13:13:35 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # even_2G_alloc 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # 
get_test_nr_hugepages 2097152 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 
00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:54.787 13:13:35 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:58.995 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:04:58.995 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 0000:00:01.0 (8086 0b00): Already using the vfio-pci 
driver 00:04:58.995 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.995 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110231856 kB' 'MemAvailable: 113452088 kB' 'Buffers: 2704 kB' 'Cached: 9766736 kB' 'SwapCached: 0 kB' 'Active: 6836668 kB' 'Inactive: 3510808 kB' 'Active(anon): 6436904 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 581320 kB' 'Mapped: 217548 kB' 'Shmem: 5858868 kB' 'KReclaimable: 268004 kB' 'Slab: 945276 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 677272 kB' 'KernelStack: 24992 kB' 'PageTables: 8332 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7961656 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229788 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB'
00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc --
setup/common.sh@31 -- # read -r var val _
00:04:58.995 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:04:58.996 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0
00:04:58.996 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0
00:04:58.996 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0
00:04:58.996 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:04:58.996 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:04:58.996 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=
00:04:58.996 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val
00:04:58.996 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem
00:04:58.996 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110237732 kB' 'MemAvailable: 113457964 kB' 'Buffers: 2704 kB' 'Cached: 9766740 kB' 'SwapCached: 0 kB' 'Active: 6836620 kB' 'Inactive: 3510808 kB' 'Active(anon): 6436856 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 581276 kB' 'Mapped: 217532 kB' 'Shmem: 5858872 kB' 'KReclaimable: 268004 kB' 'Slab: 945268 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 677264 kB' 'KernelStack: 24832 kB' 'PageTables: 8288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7958816 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229692 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB'
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': '
00:04:58.997 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.998 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.999 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 
'MemTotal: 126344032 kB' 'MemFree: 110237984 kB' 'MemAvailable: 113458216 kB' 'Buffers: 2704 kB' 'Cached: 9766756 kB' 'SwapCached: 0 kB' 'Active: 6835832 kB' 'Inactive: 3510808 kB' 'Active(anon): 6436068 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580444 kB' 'Mapped: 217372 kB' 'Shmem: 5858888 kB' 'KReclaimable: 268004 kB' 'Slab: 945240 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 677236 kB' 'KernelStack: 24880 kB' 'PageTables: 8336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7958840 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229692 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:58.999 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 
-- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 
13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.000 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:04:59.001 nr_hugepages=1024 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:04:59.001 resv_hugepages=0 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:04:59.001 surplus_hugepages=0 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:04:59.001 anon_hugepages=0 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:04:59.001 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110239788 kB' 'MemAvailable: 113460020 kB' 'Buffers: 2704 kB' 'Cached: 9766796 kB' 'SwapCached: 0 kB' 'Active: 6835716 kB' 'Inactive: 3510808 kB' 'Active(anon): 6435952 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 580300 kB' 'Mapped: 217372 kB' 'Shmem: 5858928 kB' 'KReclaimable: 268004 kB' 'Slab: 945240 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 677236 kB' 'KernelStack: 24912 kB' 'PageTables: 8448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7958860 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229708 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:04:59.001 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.001 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue [repeated per-key scan elided: each remaining /proc/meminfo key (MemFree, MemAvailable, Buffers, ... DirectMap1G) fails the HugePages_Total match and hits 'continue'] 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:04:59.003
13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 60562580 kB' 'MemUsed: 5099420 kB' 'SwapCached: 0 kB' 'Active: 1848936 kB' 'Inactive: 166524 kB' 'Active(anon): 1668040 kB' 'Inactive(anon): 0 kB' 'Active(file): 180896 kB' 'Inactive(file): 166524 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1632352 kB' 'Mapped: 60440 kB' 'AnonPages: 386304 kB' 'Shmem: 1284932 kB' 'KernelStack: 12456 kB' 'PageTables: 4756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 140448 kB' 'Slab: 474268 kB' 'SReclaimable: 140448 kB' 'SUnreclaim: 333820 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.003 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
read -r var val _ [repeated per-key scan elided: each remaining node0 meminfo key (MemFree, MemUsed, SwapCached, ... HugePages_Total) fails the HugePages_Surp match and hits 'continue'] 00:04:59.004 13:13:39
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:04:59.004 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 
00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682032 kB' 'MemFree: 49678996 kB' 'MemUsed: 11003036 kB' 'SwapCached: 0 kB' 'Active: 4987172 kB' 'Inactive: 3344284 kB' 'Active(anon): 4768304 kB' 'Inactive(anon): 0 kB' 'Active(file): 218868 kB' 'Inactive(file): 3344284 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8137172 kB' 'Mapped: 156932 kB' 'AnonPages: 194396 kB' 'Shmem: 4574020 kB' 'KernelStack: 12472 kB' 'PageTables: 3752 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127556 kB' 'Slab: 470972 kB' 'SReclaimable: 127556 kB' 'SUnreclaim: 343416 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.005 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:04:59.006 node0=512 expecting 512 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:04:59.006 node1=512 expecting 512 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:04:59.006 00:04:59.006 real 0m4.077s 
00:04:59.006 user 0m1.531s 00:04:59.006 sys 0m2.620s 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:04:59.006 13:13:39 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:04:59.006 ************************************ 00:04:59.006 END TEST even_2G_alloc 00:04:59.006 ************************************ 00:04:59.006 13:13:39 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:04:59.006 13:13:39 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:04:59.006 13:13:39 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:04:59.006 13:13:39 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:04:59.006 ************************************ 00:04:59.006 START TEST odd_alloc 00:04:59.006 ************************************ 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # odd_alloc 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@65 -- # local _no_nodes=2 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:04:59.006 13:13:39 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:03.209 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 
0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:03.209 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:03.209 
13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110250272 kB' 'MemAvailable: 113470504 kB' 'Buffers: 2704 kB' 'Cached: 9766924 kB' 'SwapCached: 0 kB' 'Active: 6838908 kB' 'Inactive: 3510808 kB' 'Active(anon): 6439144 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 583368 kB' 'Mapped: 217924 kB' 'Shmem: 5859056 kB' 'KReclaimable: 268004 kB' 'Slab: 944564 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 676560 kB' 'KernelStack: 24928 kB' 'PageTables: 8456 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7961112 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229676 kB' 
'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.209 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 
13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab 
== \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.210 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 
'MemFree: 110248432 kB' 'MemAvailable: 113468664 kB' 'Buffers: 2704 kB' 'Cached: 9766924 kB' 'SwapCached: 0 kB' 'Active: 6840660 kB' 'Inactive: 3510808 kB' 'Active(anon): 6440896 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585180 kB' 'Mapped: 217900 kB' 'Shmem: 5859056 kB' 'KReclaimable: 268004 kB' 'Slab: 944592 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 676588 kB' 'KernelStack: 24944 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7962976 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229628 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.211 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.212 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:03.213 13:13:43 
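The trace above is the `get_meminfo` helper from setup/common.sh scanning `/proc/meminfo` one key at a time (`continue` on every non-match) until it reaches `HugePages_Surp`, then echoing the value, which setup/hugepages.sh stores as `surp=0`. A minimal sketch of that pattern follows; `get_meminfo_sketch` is a hypothetical name, not the SPDK source, and it takes an optional file argument (the real helper also handles per-NUMA-node `meminfo` files) so it can be exercised off-box:

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo pattern traced above: split each meminfo
# line on ': ' and print the value for the requested key.
# get_meminfo_sketch is a hypothetical name; the real helper lives in
# setup/common.sh and additionally supports /sys/devices/system/node
# per-node meminfo files.
get_meminfo_sketch() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Mirrors the traced loop: skip every key until the match; the
        # trailing unit (e.g. "kB") lands in "_" and is discarded.
        if [[ $var == "$get" ]]; then
            echo "$val"
            return 0
        fi
    done < "$mem_f"
    return 1
}
```

With the values in this run, scanning for `HugePages_Surp` yields 0, after which the script repeats the same full scan for `HugePages_Rsvd`, as the trace below shows.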
setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110246416 kB' 'MemAvailable: 113466648 kB' 'Buffers: 2704 kB' 'Cached: 9766944 kB' 'SwapCached: 0 kB' 'Active: 6843208 kB' 'Inactive: 3510808 kB' 'Active(anon): 6443444 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 587700 kB' 'Mapped: 217900 kB' 'Shmem: 5859076 kB' 'KReclaimable: 268004 kB' 'Slab: 944592 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 676588 kB' 'KernelStack: 24944 kB' 'PageTables: 8536 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7966044 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229628 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 
'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 
13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.213 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.214 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.214 
13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:05:03.215 nr_hugepages=1025 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:03.215 resv_hugepages=0 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:03.215 surplus_hugepages=0 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:03.215 anon_hugepages=0 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc 
-- setup/common.sh@19 -- # local var val 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110242512 kB' 'MemAvailable: 113462744 kB' 'Buffers: 2704 kB' 'Cached: 9766964 kB' 'SwapCached: 0 kB' 'Active: 6844060 kB' 'Inactive: 3510808 kB' 'Active(anon): 6444296 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 588488 kB' 'Mapped: 218264 kB' 'Shmem: 5859096 kB' 'KReclaimable: 268004 kB' 'Slab: 944592 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 676588 kB' 'KernelStack: 24912 kB' 'PageTables: 8444 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70511020 kB' 'Committed_AS: 7966068 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229632 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 
'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.215 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 
00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 
-- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.216 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@27 -- # local node 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.217 
13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 60557732 kB' 'MemUsed: 5104268 kB' 'SwapCached: 0 kB' 'Active: 1848696 kB' 'Inactive: 166524 kB' 'Active(anon): 1667800 kB' 'Inactive(anon): 0 kB' 'Active(file): 180896 kB' 'Inactive(file): 166524 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1632392 kB' 'Mapped: 60444 kB' 'AnonPages: 385976 kB' 'Shmem: 1284972 kB' 'KernelStack: 12424 kB' 'PageTables: 4660 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 140448 kB' 'Slab: 474056 kB' 'SReclaimable: 140448 kB' 'SUnreclaim: 333608 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 
13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.217 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 
13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:03.218 
13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.218 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682032 kB' 'MemFree: 49685360 kB' 'MemUsed: 10996672 kB' 'SwapCached: 0 kB' 'Active: 4989776 kB' 'Inactive: 3344284 kB' 'Active(anon): 4770908 kB' 'Inactive(anon): 0 kB' 'Active(file): 218868 kB' 'Inactive(file): 3344284 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8137296 kB' 'Mapped: 156952 kB' 'AnonPages: 196944 kB' 'Shmem: 4574144 kB' 'KernelStack: 12488 kB' 'PageTables: 3780 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127556 kB' 'Slab: 470536 kB' 'SReclaimable: 
127556 kB' 'SUnreclaim: 342980 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 
13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.219 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:05:03.220 node0=512 expecting 513 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:05:03.220 node1=513 expecting 512 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- 
setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:05:03.220 00:05:03.220 real 0m4.120s 00:05:03.220 user 0m1.528s 00:05:03.220 sys 0m2.656s 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:03.220 13:13:43 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:03.220 ************************************ 00:05:03.220 END TEST odd_alloc 00:05:03.220 ************************************ 00:05:03.220 13:13:43 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:05:03.220 13:13:43 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:03.220 13:13:43 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:03.220 13:13:43 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:03.220 ************************************ 00:05:03.220 START TEST custom_alloc 00:05:03.220 ************************************ 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # custom_alloc 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- 
setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:05:03.220 13:13:43 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 00:05:03.220 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:05:03.221 13:13:43 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:05:03.221 13:13:43 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=1024 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:03.221 13:13:43 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:07.429 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:07.429 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:07.429 0000:00:01.1 (8086 0b00): Already using 
the vfio-pci driver 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # nr_hugepages=1536 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.429 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 109197272 kB' 'MemAvailable: 112417504 kB' 'Buffers: 2704 kB' 'Cached: 9767088 kB' 'SwapCached: 0 kB' 'Active: 6839696 kB' 'Inactive: 3510808 kB' 'Active(anon): 6439932 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 583908 kB' 'Mapped: 217436 kB' 'Shmem: 5859220 kB' 'KReclaimable: 268004 kB' 'Slab: 945108 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 677104 kB' 'KernelStack: 24912 kB' 'PageTables: 8448 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7960568 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229772 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.429 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.430 
13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.430 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:07.431 
13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 109199624 kB' 'MemAvailable: 112419856 kB' 'Buffers: 2704 kB' 'Cached: 9767092 kB' 'SwapCached: 0 kB' 'Active: 6839992 kB' 'Inactive: 3510808 kB' 'Active(anon): 6440228 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 
'Writeback: 0 kB' 'AnonPages: 584276 kB' 'Mapped: 217436 kB' 'Shmem: 5859224 kB' 'KReclaimable: 268004 kB' 'Slab: 945092 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 677088 kB' 'KernelStack: 24896 kB' 'PageTables: 8400 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7960584 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229724 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 
13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # 
continue 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.431 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.432 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.433 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 109200084 kB' 'MemAvailable: 112420316 kB' 'Buffers: 2704 kB' 'Cached: 9767092 kB' 'SwapCached: 0 kB' 'Active: 6839400 kB' 'Inactive: 3510808 kB' 'Active(anon): 6439636 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 
kB' 'AnonPages: 583644 kB' 'Mapped: 217416 kB' 'Shmem: 5859224 kB' 'KReclaimable: 268004 kB' 'Slab: 945120 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 677116 kB' 'KernelStack: 24928 kB' 'PageTables: 8500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7960608 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229724 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.433 
13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.433 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.434 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:05:07.435 nr_hugepages=1536 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc 
-- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:07.435 resv_hugepages=0 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:07.435 surplus_hugepages=0 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:07.435 anon_hugepages=0 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 109200360 kB' 'MemAvailable: 112420592 kB' 
'Buffers: 2704 kB' 'Cached: 9767132 kB' 'SwapCached: 0 kB' 'Active: 6839444 kB' 'Inactive: 3510808 kB' 'Active(anon): 6439680 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 583644 kB' 'Mapped: 217416 kB' 'Shmem: 5859264 kB' 'KReclaimable: 268004 kB' 'Slab: 945120 kB' 'SReclaimable: 268004 kB' 'SUnreclaim: 677116 kB' 'KernelStack: 24928 kB' 'PageTables: 8500 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 69987756 kB' 'Committed_AS: 7960628 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229724 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.435 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 
13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.436 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 
13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in 
/sys/devices/system/node/node+([0-9]) 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 60562164 kB' 'MemUsed: 
5099836 kB' 'SwapCached: 0 kB' 'Active: 1848720 kB' 'Inactive: 166524 kB' 'Active(anon): 1667824 kB' 'Inactive(anon): 0 kB' 'Active(file): 180896 kB' 'Inactive(file): 166524 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1632392 kB' 'Mapped: 60440 kB' 'AnonPages: 385944 kB' 'Shmem: 1284972 kB' 'KernelStack: 12456 kB' 'PageTables: 4756 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 140448 kB' 'Slab: 474236 kB' 'SReclaimable: 140448 kB' 'SUnreclaim: 333788 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.437 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.438 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:07.439 13:13:47 
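The trace above is `setup/common.sh`'s `get_meminfo` scanning every key in a meminfo file until it reaches `HugePages_Surp`, echoing the value and returning. A minimal standalone sketch of that pattern (the file path and demo values below are illustrative, not taken from the run):

```shell
#!/usr/bin/env bash
# Minimal sketch of the get_meminfo scan traced above: split each line on
# ': ' and echo the value of the first key that matches the request.
get_meminfo() {
    local get=$1 mem_f=$2 var val _
    while IFS=': ' read -r var val _; do
        # Non-matching keys are skipped, mirroring the `continue` trace.
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done < "$mem_f"
    return 1
}

# Demo input standing in for /sys/devices/system/node/node0/meminfo.
printf 'HugePages_Total: 1024\nHugePages_Free: 1024\nHugePages_Surp: 0\n' > /tmp/meminfo.demo
get_meminfo HugePages_Surp /tmp/meminfo.demo   # prints 0
```

The long `[[ key == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] / continue` runs in the log are simply this loop visiting every non-matching key under `set -x`.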
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60682032 kB' 'MemFree: 48639180 kB' 'MemUsed: 12042852 kB' 'SwapCached: 0 kB' 'Active: 4990796 kB' 'Inactive: 3344284 kB' 'Active(anon): 4771928 kB' 'Inactive(anon): 0 kB' 'Active(file): 218868 kB' 'Inactive(file): 3344284 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8137488 kB' 'Mapped: 156976 kB' 'AnonPages: 197708 kB' 'Shmem: 4574336 kB' 'KernelStack: 12472 kB' 'PageTables: 
3744 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 127556 kB' 'Slab: 470884 kB' 'SReclaimable: 127556 kB' 'SUnreclaim: 343328 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.439 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:07.440 13:13:47 
setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:05:07.440 node0=512 expecting 512 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:07.440 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:07.441 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:07.441 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:05:07.441 node1=1024 expecting 1024 00:05:07.441 13:13:47 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:05:07.441 00:05:07.441 real 0m4.156s 00:05:07.441 user 0m1.631s 00:05:07.441 sys 0m2.602s 00:05:07.441 13:13:47 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:07.441 13:13:47 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:07.441 ************************************ 00:05:07.441 END TEST custom_alloc 00:05:07.441 ************************************ 00:05:07.441 13:13:47 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:05:07.441 13:13:47 setup.sh.hugepages -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:07.441 13:13:47 setup.sh.hugepages -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:07.441 13:13:47 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:07.441 ************************************ 00:05:07.441 START TEST no_shrink_alloc 00:05:07.441 ************************************ 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # no_shrink_alloc 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:05:07.441 13:13:47 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@51 -- # shift 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 
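The `no_shrink_alloc` prologue traced above converts `size=2097152` (kB) into 1024 pages of the default 2048 kB size and, because an explicit node list (`'0'`) was passed, assigns the whole count to node 0. A hedged sketch of that arithmetic (the helper name `split_hugepages` is mine, not the script's, and the two-node fallback is an assumption based on this rig):

```shell
#!/usr/bin/env bash
# Sketch of get_test_nr_hugepages / get_test_nr_hugepages_per_node as
# traced above: size (kB) -> page count -> per-node assignment.
default_hugepages_kb=2048   # Hugepagesize from /proc/meminfo

split_hugepages() {
    local size_kb=$1; shift
    local user_nodes=("$@") n
    local nr=$(( size_kb / default_hugepages_kb ))
    local -A nodes_test=()
    if (( ${#user_nodes[@]} > 0 )); then
        # Explicit node ids: the full count goes to each listed node,
        # matching hugepages.sh@70-71.
        for n in "${user_nodes[@]}"; do nodes_test[$n]=$nr; done
    else
        # No node list: split evenly across two NUMA nodes (assumed).
        nodes_test[0]=$(( nr / 2 )); nodes_test[1]=$(( nr / 2 ))
    fi
    for n in "${!nodes_test[@]}"; do echo "node$n=${nodes_test[$n]}"; done
}

split_hugepages 2097152 0   # prints node0=1024
```

The earlier `custom_alloc` test's `node0=512 expecting 512` / `node1=1024 expecting 1024` lines are the verification half of the same bookkeeping: the observed per-node counts are compared against the values computed this way.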
00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:05:07.441 13:13:47 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:05:11.660 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:80:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:80:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:80:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:80:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:80:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:80:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:80:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:00:01.6 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:65:00.0 (8086 0a54): Already using the vfio-pci driver 00:05:11.660 0000:00:01.7 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:00:01.4 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:00:01.5 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:00:01.2 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:00:01.3 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:00:01.0 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 0000:00:01.1 (8086 0b00): Already using the vfio-pci driver 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local 
surp 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110232436 kB' 'MemAvailable: 113452652 kB' 'Buffers: 2704 kB' 'Cached: 9767272 kB' 'SwapCached: 0 kB' 'Active: 6840368 kB' 'Inactive: 3510808 kB' 'Active(anon): 6440604 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 
kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 584068 kB' 'Mapped: 217564 kB' 'Shmem: 5859404 kB' 'KReclaimable: 267972 kB' 'Slab: 945368 kB' 'SReclaimable: 267972 kB' 'SUnreclaim: 677396 kB' 'KernelStack: 24912 kB' 'PageTables: 8480 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7963296 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229612 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.660 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.661 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.662 
13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110231136 kB' 'MemAvailable: 113451352 kB' 'Buffers: 2704 kB' 'Cached: 9767276 kB' 'SwapCached: 0 kB' 'Active: 6841016 kB' 'Inactive: 3510808 kB' 'Active(anon): 6441252 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 584752 kB' 'Mapped: 217564 kB' 'Shmem: 5859408 kB' 'KReclaimable: 267972 kB' 'Slab: 945368 kB' 'SReclaimable: 267972 kB' 
'SUnreclaim: 677396 kB' 'KernelStack: 24960 kB' 'PageTables: 8764 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7965044 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229660 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.662 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 
13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 
13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.663 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110230552 kB' 'MemAvailable: 113450768 kB' 'Buffers: 2704 kB' 'Cached: 9767292 kB' 'SwapCached: 0 kB' 'Active: 6840504 kB' 'Inactive: 3510808 kB' 'Active(anon): 6440740 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 584680 kB' 'Mapped: 217468 kB' 'Shmem: 5859424 kB' 'KReclaimable: 267972 kB' 'Slab: 945352 kB' 'SReclaimable: 267972 kB' 'SUnreclaim: 677380 kB' 'KernelStack: 25008 kB' 'PageTables: 8680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7965064 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229772 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.664 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.665 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 
13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:11.666 nr_hugepages=1024 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:11.666 resv_hugepages=0 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:11.666 surplus_hugepages=0 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:11.666 anon_hugepages=0 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:11.666 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110229812 kB' 'MemAvailable: 113450028 kB' 'Buffers: 2704 kB' 'Cached: 9767316 kB' 'SwapCached: 0 kB' 'Active: 6840396 kB' 'Inactive: 3510808 kB' 'Active(anon): 6440632 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 584524 kB' 'Mapped: 217468 kB' 'Shmem: 5859448 kB' 'KReclaimable: 267972 kB' 'Slab: 945352 kB' 'SReclaimable: 267972 kB' 'SUnreclaim: 677380 kB' 'KernelStack: 25024 kB' 'PageTables: 8676 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7965088 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229772 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 
'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.666 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.667 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc 
-- setup/hugepages.sh@32 -- # no_nodes=2 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:11.668 13:13:51 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 59503608 kB' 'MemUsed: 6158392 kB' 'SwapCached: 0 kB' 'Active: 1849544 kB' 'Inactive: 166524 kB' 'Active(anon): 1668648 kB' 'Inactive(anon): 0 kB' 'Active(file): 180896 kB' 
'Inactive(file): 166524 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1632424 kB' 'Mapped: 60440 kB' 'AnonPages: 386760 kB' 'Shmem: 1285004 kB' 'KernelStack: 12536 kB' 'PageTables: 5040 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 140448 kB' 'Slab: 474596 kB' 'SReclaimable: 140448 kB' 'SUnreclaim: 334148 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.668 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 
13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 
13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:11.669 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[... identical scan over the remaining /proc/meminfo keys (WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free): none matches HugePages_Surp, each iteration hits setup/common.sh@32 continue ...]
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 ))
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}"
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024'
node0=1024 expecting 1024
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]]
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]]
00:05:11.670 13:13:52 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:05:15.922 0000:80:01.6 (8086 0b00): Already using the vfio-pci driver
[... all remaining 0000:80:01.x and 0000:00:01.x functions (8086 0b00) and 0000:65:00.0 (8086 0a54) likewise report: Already using the vfio-pci driver ...]
00:05:15.923 INFO: Requested 512 hugepages but 1024 already allocated on node0
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]]
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
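The trace above is setup/common.sh's meminfo lookup pattern: read the meminfo file with `IFS=': '`, `continue` past every key that does not match the requested one, and echo the value once it does. A minimal standalone sketch of that loop (function and file names here are illustrative, mirroring the traced logic rather than quoting the exact SPDK source):

```shell
#!/usr/bin/env bash
# Sketch of the lookup the trace performs: scan a meminfo-format file
# line by line and print the value column of the requested key.
get_meminfo() {
    local get=$1 mem_f=${2:-/proc/meminfo} var val _
    while IFS=': ' read -r var val _; do
        # Matching key found: print its value and stop scanning
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
        continue  # key did not match; scan the next meminfo line
    done < "$mem_f"
    return 1
}

# Demo against a small meminfo-style sample file
printf '%s\n' 'MemTotal: 126344032 kB' 'HugePages_Total: 1024' \
    'HugePages_Free: 1024' > /tmp/meminfo.sample
get_meminfo HugePages_Total /tmp/meminfo.sample   # prints 1024
```

The `continue` lines that dominate the log are exactly this loop skipping the dozens of meminfo keys that precede the one being queried.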
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110227752 kB' 'MemAvailable: 113447968 kB' 'Buffers: 2704 kB' 'Cached: 9767432 kB' 'SwapCached: 0 kB' 'Active: 6842036 kB' 'Inactive: 3510808 kB' 'Active(anon): 6442272 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585984 kB' 'Mapped: 217512 kB' 'Shmem: 5859564 kB' 'KReclaimable: 267972 kB' 'Slab: 945752 kB' 'SReclaimable: 267972 kB' 'SUnreclaim: 677780 kB' 'KernelStack: 25088 kB' 'PageTables: 8992 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7966320 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229852 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB'
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:15.923 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... identical scan over the /proc/meminfo keys from MemTotal through HardwareCorrupted: none matches AnonHugePages, each iteration hits setup/common.sh@32 continue ...]
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:05:15.924 13:13:55 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:05:15.924 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:05:15.924 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:15.924 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
00:05:15.924 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110226100 kB' 'MemAvailable: 113446316 kB' 'Buffers: 2704 kB' 'Cached: 9767436 kB' 'SwapCached: 0 kB' 'Active: 6842484 kB' 'Inactive: 3510808 kB' 'Active(anon): 6442720 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585996 kB' 'Mapped: 217576 kB' 'Shmem: 5859568 kB' 'KReclaimable: 267972 kB' 'Slab: 945776 kB' 'SReclaimable: 267972 kB' 'SUnreclaim: 677804 kB' 'KernelStack: 25136 kB' 'PageTables: 8944 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7966340 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229788 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB'
00:05:15.924 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:05:15.924 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[... identical scan over MemFree, MemAvailable, Buffers, Cached: none matches HugePages_Surp, each iteration hits setup/common.sh@32 continue ...]
00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': '
00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _
[... identical scan over SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, SwapTotal, SwapFree, Zswap, Zswapped, Dirty, Writeback, AnonPages, Mapped, Shmem, KReclaimable: none matches HugePages_Surp, each iteration hits setup/common.sh@32 continue; log truncated mid-scan ...]
continue 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.925 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 
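The xtrace above is the setup/common.sh `get_meminfo` helper scanning /proc/meminfo line by line until the requested key (here HugePages_Surp) matches, then echoing its value. A minimal sketch of that loop, reconstructed from the trace only (the real setup/common.sh may differ; the MEMINFO_FILE override is a hypothetical addition so the sketch can run against a fixture file):

```shell
#!/usr/bin/env bash
# Sketch of the get_meminfo lookup seen in the xtrace above. Reconstructed
# from the trace, not copied from setup/common.sh; MEMINFO_FILE is a
# hypothetical override, not part of the original script.
get_meminfo() {
    local get=$1 node=${2:-}
    local mem_f=${MEMINFO_FILE:-/proc/meminfo}
    # Per-NUMA-node lookups read that node's own meminfo when it exists
    if [[ -n $node && -e /sys/devices/system/node/node$node/meminfo ]]; then
        mem_f=/sys/devices/system/node/node$node/meminfo
    fi
    local var val _
    while IFS=': ' read -r var val _; do
        [[ $var == "$get" ]] || continue   # skip keys until the match
        echo "$val"                        # e.g. "0" for HugePages_Surp
        return 0
    done < "$mem_f"
    return 1
}

# On a Linux host this prints the current surplus huge page count
[[ -r /proc/meminfo ]] && get_meminfo HugePages_Surp || true
```

The `IFS=': '` setting splits each "Key: value unit" line on both the colon and whitespace, which is why the trace shows `read -r var val _` re-run once per meminfo line, with a `continue` for every non-matching key.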
00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.926 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.927 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110227076 kB' 'MemAvailable: 113447292 kB' 'Buffers: 2704 kB' 'Cached: 9767452 kB' 'SwapCached: 0 kB' 'Active: 6841500 kB' 'Inactive: 3510808 kB' 'Active(anon): 6441736 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585476 kB' 'Mapped: 217480 kB' 'Shmem: 5859584 kB' 'KReclaimable: 267972 kB' 'Slab: 945768 kB' 
'SReclaimable: 267972 kB' 'SUnreclaim: 677796 kB' 'KernelStack: 25088 kB' 'PageTables: 9088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7966360 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229884 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:15.927 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:15.927 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue
[identical IFS=': ' / read -r var val _ / [[ <key> == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] / continue iterations elided for the subsequent /proc/meminfo keys from MemFree through FilePmdMapped]
00:05:15.928 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:15.928 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.928 13:13:56 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:15.928 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.928 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:15.928 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.928 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.928 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:05:15.929 13:13:56 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:05:15.929 nr_hugepages=1024 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:05:15.929 resv_hugepages=0 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:05:15.929 surplus_hugepages=0 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:05:15.929 anon_hugepages=0 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 
-- # mapfile -t mem 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 126344032 kB' 'MemFree: 110226880 kB' 'MemAvailable: 113447096 kB' 'Buffers: 2704 kB' 'Cached: 9767452 kB' 'SwapCached: 0 kB' 'Active: 6841608 kB' 'Inactive: 3510808 kB' 'Active(anon): 6441844 kB' 'Inactive(anon): 0 kB' 'Active(file): 399764 kB' 'Inactive(file): 3510808 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 585580 kB' 'Mapped: 217480 kB' 'Shmem: 5859584 kB' 'KReclaimable: 267972 kB' 'Slab: 945768 kB' 'SReclaimable: 267972 kB' 'SUnreclaim: 677796 kB' 'KernelStack: 25184 kB' 'PageTables: 9152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 70512044 kB' 'Committed_AS: 7966384 kB' 'VmallocTotal: 13743895347199 kB' 'VmallocUsed: 229868 kB' 'VmallocChunk: 0 kB' 'Percpu: 94208 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2814244 kB' 'DirectMap2M: 18886656 kB' 'DirectMap1G: 114294784 kB' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.929 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.930 13:13:56 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.930 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@18 -- # local node=0 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 65662000 kB' 'MemFree: 59510940 kB' 'MemUsed: 6151060 kB' 'SwapCached: 0 kB' 'Active: 1851020 kB' 'Inactive: 166524 kB' 'Active(anon): 1670124 kB' 'Inactive(anon): 0 kB' 'Active(file): 180896 kB' 'Inactive(file): 166524 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1632460 kB' 'Mapped: 60440 kB' 'AnonPages: 388224 kB' 'Shmem: 1285040 kB' 'KernelStack: 12680 kB' 'PageTables: 5096 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 140448 kB' 'Slab: 474912 kB' 'SReclaimable: 140448 kB' 'SUnreclaim: 334464 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- 
# [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read 
-r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.931 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.932 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.932 13:13:56 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 
'node0=1024 expecting 1024' 00:05:15.933 node0=1024 expecting 1024 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:05:15.933 00:05:15.933 real 0m8.233s 00:05:15.933 user 0m3.256s 00:05:15.933 sys 0m5.129s 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:15.933 13:13:56 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:05:15.933 ************************************ 00:05:15.933 END TEST no_shrink_alloc 00:05:15.933 ************************************ 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:05:15.933 13:13:56 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:05:15.933 13:13:56 setup.sh.hugepages -- 
setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:05:15.933 00:05:15.933 real 0m31.448s 00:05:15.933 user 0m11.628s 00:05:15.933 sys 0m18.593s 00:05:15.933 13:13:56 setup.sh.hugepages -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:15.933 13:13:56 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:05:15.933 ************************************ 00:05:15.933 END TEST hugepages 00:05:15.933 ************************************ 00:05:15.933 13:13:56 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:15.933 13:13:56 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:15.933 13:13:56 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:15.933 13:13:56 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:15.933 ************************************ 00:05:15.933 START TEST driver 00:05:15.933 ************************************ 00:05:15.933 13:13:56 setup.sh.driver -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:05:15.933 * Looking for test storage... 
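The long trace above walks a per-node meminfo file record by record, skipping every field until it reaches HugePages_Surp. A minimal standalone sketch of that parsing pattern, assuming a simplified helper (`get_meminfo_field` is an illustrative name, not part of the SPDK tree):

```shell
#!/usr/bin/env bash
# Minimal sketch of the pattern exercised by setup/common.sh above:
# read a meminfo-style file into an array with mapfile, strip the
# optional "Node <n> " prefix, then split each "Key: value kB" record
# on ': ' until the requested key is found.
get_meminfo_field() {
    local target=$1 file=$2 line var val _
    local -a mem
    mapfile -t mem < "$file"
    for line in "${mem[@]}"; do
        # per-node files prepend "Node <n> "; this simplified strip only
        # handles single-digit node ids (the real helper uses extglob)
        line=${line#Node [0-9] }
        IFS=': ' read -r var val _ <<< "$line"
        if [[ $var == "$target" ]]; then
            echo "$val"
            return 0
        fi
    done
    return 1
}
```

Against /sys/devices/system/node/node0/meminfo this would return, for example, the HugePages_Surp count that the loop above searches for field by field.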
00:05:15.933 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:15.933 13:13:56 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:05:15.933 13:13:56 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:15.933 13:13:56 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:21.220 13:14:01 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:05:21.220 13:14:01 setup.sh.driver -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:21.220 13:14:01 setup.sh.driver -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:21.220 13:14:01 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:21.220 ************************************ 00:05:21.220 START TEST guess_driver 00:05:21.220 ************************************ 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # guess_driver 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_groups 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- 
# (( 370 > 0 )) 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:05:21.220 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:21.220 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:21.220 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:05:21.220 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:05:21.220 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:05:21.220 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:05:21.220 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:05:21.220 Looking for driver=vfio-pci 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output 
config 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:05:21.220 13:14:01 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # 
[[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ 
_ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:25.484 13:14:05 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.869 13:14:07 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:05:26.869 13:14:07 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:05:26.869 13:14:07 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:05:26.869 13:14:07 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 00:05:26.869 13:14:07 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:05:26.869 13:14:07 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:26.869 13:14:07 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:32.156 00:05:32.156 real 0m10.905s 00:05:32.156 user 0m2.898s 00:05:32.156 sys 0m5.477s 00:05:32.156 13:14:12 setup.sh.driver.guess_driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:32.156 13:14:12 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:05:32.156 ************************************ 00:05:32.156 END TEST guess_driver 00:05:32.156 ************************************ 00:05:32.156 00:05:32.156 real 0m16.374s 00:05:32.156 user 0m4.509s 00:05:32.156 sys 0m8.517s 00:05:32.156 13:14:12 setup.sh.driver -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:32.156 13:14:12 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:05:32.156 ************************************ 00:05:32.156 END TEST driver 00:05:32.156 ************************************ 00:05:32.156 13:14:12 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:32.156 13:14:12 setup.sh -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:32.156 13:14:12 setup.sh -- common/autotest_common.sh@1107 -- # xtrace_disable 00:05:32.156 13:14:12 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:05:32.156 ************************************ 00:05:32.156 START TEST devices 00:05:32.156 ************************************ 00:05:32.156 13:14:12 setup.sh.devices -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:05:32.156 * Looking for test storage... 
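The guess_driver trace resolves vfio_pci with `modprobe --show-depends` and accepts it only when IOMMU groups exist and the module resolves to real .ko files. A condensed sketch of that decision, with illustrative function names (the real setup/driver.sh also tries uio_pci_generic before giving up):

```shell
#!/usr/bin/env bash
# Condensed sketch of the vfio-pci selection traced in setup/driver.sh.
has_ko_modules() {
    # modprobe --show-depends prints "insmod /path/mod.ko[.xz]" lines
    # when the driver resolves to on-disk modules; built-ins print none
    [[ $1 == *.ko* ]]
}

pick_driver() {
    local iommu_group_count=$1 vfio_deps=$2
    if (( iommu_group_count > 0 )) && has_ko_modules "$vfio_deps"; then
        echo vfio-pci
    else
        echo 'No valid driver found'
    fi
}
```

With the values from the log (370 IOMMU groups, a dependency list ending in vfio-pci.ko.xz) this selects vfio-pci, matching the `driver=vfio-pci` assignment above.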
00:05:32.156 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:05:32.156 13:14:12 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:05:32.156 13:14:12 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:05:32.156 13:14:12 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:05:32.156 13:14:12 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:05:37.445 13:14:17 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:05:37.445 13:14:17 setup.sh.devices -- common/autotest_common.sh@1669 -- # zoned_devs=() 00:05:37.445 13:14:17 setup.sh.devices -- common/autotest_common.sh@1669 -- # local -gA zoned_devs 00:05:37.445 13:14:17 setup.sh.devices -- common/autotest_common.sh@1670 -- # local nvme bdf 00:05:37.445 13:14:17 setup.sh.devices -- common/autotest_common.sh@1672 -- # for nvme in /sys/block/nvme* 00:05:37.445 13:14:17 setup.sh.devices -- common/autotest_common.sh@1673 -- # is_block_zoned nvme0n1 00:05:37.445 13:14:17 setup.sh.devices -- common/autotest_common.sh@1662 -- # local device=nvme0n1 00:05:37.445 13:14:17 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:05:37.445 13:14:17 setup.sh.devices -- common/autotest_common.sh@1665 -- # [[ none != none ]] 00:05:37.445 13:14:17 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 
00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:65:00.0 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@203 -- # [[ '' == *\0\0\0\0\:\6\5\:\0\0\.\0* ]] 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme0n1 00:05:37.446 13:14:17 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme0n1 pt 00:05:37.446 13:14:17 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme0n1 00:05:37.446 No valid GPT data, bailing 00:05:37.446 13:14:17 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:05:37.446 13:14:17 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:05:37.446 13:14:17 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme0n1 00:05:37.446 13:14:17 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme0n1 00:05:37.446 13:14:17 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme0n1 ]] 00:05:37.446 13:14:17 setup.sh.devices -- setup/common.sh@80 -- # echo 2000398934016 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@204 -- # (( 2000398934016 >= min_disk_size )) 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:65:00.0 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@211 -- # declare -r test_disk=nvme0n1 00:05:37.446 13:14:17 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:05:37.446 13:14:17 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:37.446 13:14:17 setup.sh.devices -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:05:37.446 13:14:17 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:37.446 ************************************ 00:05:37.446 START TEST nvme_mount 00:05:37.446 ************************************ 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # nvme_mount 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme0n1 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme0n1p1 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme0n1 1 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:37.446 13:14:17 
setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:37.446 13:14:17 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 00:05:37.707 Creating new GPT entries in memory. 00:05:37.707 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:37.707 other utilities. 00:05:37.707 13:14:18 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:37.707 13:14:18 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:37.707 13:14:18 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:37.707 13:14:18 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:37.707 13:14:18 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:38.649 Creating new GPT entries in memory. 00:05:38.649 The operation has completed successfully. 
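The `flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199` call above comes from simple sector arithmetic in `setup/common.sh`: the 1 GiB partition size is divided by 512 to get sectors, the first partition starts at sector 2048, and the end is `start + size - 1`. A standalone sketch of that computation:

```shell
# Reproduce the bounds behind "sgdisk --new=1:2048:2099199".
size=1073741824                  # partition size in bytes (1 GiB)
(( size /= 512 ))                # convert to 512-byte sectors: 2097152
part_start=0 part_end=0
# First partition starts at sector 2048; later ones would start
# one sector past the previous partition's end.
(( part_start = part_start == 0 ? 2048 : part_end + 1 ))
(( part_end = part_start + size - 1 ))
new_arg="--new=1:$part_start:$part_end"
echo "$new_arg"
```

The resulting `--new=1:2048:2099199` matches the sgdisk invocation in the log exactly.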
00:05:38.649 13:14:19 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:38.649 13:14:19 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:38.649 13:14:19 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 800759 00:05:38.649 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.649 13:14:19 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:05:38.649 13:14:19 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.649 13:14:19 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1p1 ]] 00:05:38.649 13:14:19 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1p1 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:65:00.0 nvme0n1:nvme0n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1p1 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:38.909 
13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:38.909 13:14:19 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:38.910 13:14:19 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 
00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1\p\1* ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 
00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:05:43.112 13:14:23 
setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:05:43.112 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:43.112 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:05:43.112 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:05:43.112 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:05:43.112 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme0n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme0n1 ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme0n1 1024M 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme0n1 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:65:00.0 nvme0n1:nvme0n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme0n1 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:43.112 13:14:23 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 
00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme0n1:nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ 
*\n\v\m\e\0\n\1\:\n\v\m\e\0\n\1* ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- 
setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:65:00.0 data@nvme0n1 '' '' 00:05:47.313 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme0n1 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- 
setup/devices.sh@47 -- # setup output config 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:47.314 13:14:27 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 
00:05:50.613 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme0n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\0\n\1* ]] 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.874 13:14:31 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:50.874 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:51.134 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:51.134 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:05:51.134 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:05:51.134 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:05:51.134 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:05:51.134 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:05:51.134 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:05:51.134 13:14:31 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:05:51.134 /dev/nvme0n1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:05:51.134 00:05:51.134 real 0m14.344s 00:05:51.134 user 0m4.506s 00:05:51.134 sys 0m7.721s 00:05:51.134 13:14:31 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:05:51.134 13:14:31 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:05:51.134 ************************************ 00:05:51.134 END TEST nvme_mount 00:05:51.134 ************************************ 00:05:51.134 13:14:31 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:05:51.134 13:14:31 setup.sh.devices -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:05:51.134 13:14:31 setup.sh.devices -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:05:51.134 13:14:31 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:05:51.134 ************************************ 00:05:51.134 START TEST dm_mount 00:05:51.134 ************************************ 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # dm_mount 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme0n1 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme0n1p1 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme0n1p2 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme0n1 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme0n1 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:05:51.134 13:14:31 
setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme0n1 --zap-all 00:05:51.134 13:14:31 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme0n1p1 nvme0n1p2 00:05:52.076 Creating new GPT entries in memory. 00:05:52.076 GPT data structures destroyed! You may now partition the disk using fdisk or 00:05:52.076 other utilities. 00:05:52.076 13:14:32 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:05:52.076 13:14:32 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:52.076 13:14:32 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:52.076 13:14:32 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:52.076 13:14:32 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=1:2048:2099199 00:05:53.477 Creating new GPT entries in memory. 00:05:53.477 The operation has completed successfully. 00:05:53.477 13:14:33 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:53.477 13:14:33 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:53.477 13:14:33 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:05:53.477 13:14:33 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:05:53.477 13:14:33 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme0n1 sgdisk /dev/nvme0n1 --new=2:2099200:4196351 00:05:54.117 The operation has completed successfully. 
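`dm_mount` runs the same partitioning loop with `part_no=2`, so the second iteration starts one sector past the previous partition's end, which is where the `--new=2:2099200:4196351` bounds above come from. Continuing the arithmetic for the second partition:

```shell
# Second partition of the dm_mount layout: it begins one sector
# after the first partition's end (2099199) and spans the same
# 1 GiB (2097152 sectors), matching sgdisk --new=2:2099200:4196351.
size=$(( 1073741824 / 512 ))   # 2097152 sectors
prev_end=2099199               # end of partition 1
part_start=$(( prev_end + 1 ))
part_end=$(( part_start + size - 1 ))
new_arg="--new=2:$part_start:$part_end"
echo "$new_arg"
```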
00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 805884 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:54.117 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # dm=/dev/dm-0 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme0n1p1/holders/dm-0 ]] 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme0n1p2/holders/dm-0 ]] 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local 
dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:65:00.0 nvme0n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme0n1:nvme_dm_test 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:54.378 
13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:54.378 13:14:34 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:54.379 13:14:34 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:54.379 13:14:34 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 
13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0,mount@nvme0n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\0\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == 
\0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:65:00.0 holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 '' '' 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:65:00.0 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local 
test_file= 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:65:00.0 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:05:58.586 13:14:38 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- 
setup/devices.sh@62 -- # [[ 0000:80:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:65:00.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme0n1p1:dm-0,holder@nvme0n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\0\n\1\p\2\:\d\m\-\0* ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.6 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.7 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.4 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.5 == \0\0\0\0\:\6\5\:\0\0\.\0 
]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.2 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.890 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.3 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.891 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.891 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.0 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.891 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:01.891 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:01.1 == \0\0\0\0\:\6\5\:\0\0\.\0 ]] 00:06:01.891 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme0n1p1 00:06:02.152 /dev/nvme0n1p1: 2 bytes were erased at offset 0x00000438 
(ext4): 53 ef 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme0n1p2 00:06:02.152 00:06:02.152 real 0m11.072s 00:06:02.152 user 0m2.992s 00:06:02.152 sys 0m5.146s 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:02.152 13:14:42 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:06:02.152 ************************************ 00:06:02.152 END TEST dm_mount 00:06:02.152 ************************************ 00:06:02.152 13:14:42 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:06:02.152 13:14:42 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:06:02.152 13:14:42 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:06:02.152 13:14:42 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:02.152 13:14:42 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme0n1p1 00:06:02.152 13:14:42 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme0n1 ]] 00:06:02.152 13:14:42 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme0n1 00:06:02.413 /dev/nvme0n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:06:02.413 /dev/nvme0n1: 8 bytes were erased at offset 0x1d1c1115e00 (gpt): 45 46 49 20 50 41 52 54 00:06:02.413 /dev/nvme0n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:06:02.413 /dev/nvme0n1: calling ioctl to re-read partition table: Success 00:06:02.413 13:14:43 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:06:02.413 13:14:43 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:06:02.413 13:14:43 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 
00:06:02.413 13:14:43 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme0n1p1 ]] 00:06:02.413 13:14:43 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme0n1p2 ]] 00:06:02.413 13:14:43 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme0n1 ]] 00:06:02.413 13:14:43 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme0n1 00:06:02.413 00:06:02.413 real 0m30.468s 00:06:02.413 user 0m9.346s 00:06:02.413 sys 0m15.965s 00:06:02.413 13:14:43 setup.sh.devices -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:02.413 13:14:43 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:06:02.413 ************************************ 00:06:02.413 END TEST devices 00:06:02.413 ************************************ 00:06:02.673 00:06:02.674 real 1m47.164s 00:06:02.674 user 0m35.096s 00:06:02.674 sys 1m0.093s 00:06:02.674 13:14:43 setup.sh -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:02.674 13:14:43 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:06:02.674 ************************************ 00:06:02.674 END TEST setup.sh 00:06:02.674 ************************************ 00:06:02.674 13:14:43 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:06:06.880 Hugepages 00:06:06.880 node hugesize free / total 00:06:06.880 node0 1048576kB 0 / 0 00:06:06.880 node0 2048kB 1024 / 1024 00:06:06.880 node1 1048576kB 0 / 0 00:06:06.880 node1 2048kB 1024 / 1024 00:06:06.880 00:06:06.880 Type BDF Vendor Device NUMA Driver Device Block devices 00:06:06.880 I/OAT 0000:00:01.0 8086 0b00 0 ioatdma - - 00:06:06.880 I/OAT 0000:00:01.1 8086 0b00 0 ioatdma - - 00:06:06.880 I/OAT 0000:00:01.2 8086 0b00 0 ioatdma - - 00:06:06.880 I/OAT 0000:00:01.3 8086 0b00 0 ioatdma - - 00:06:06.880 I/OAT 0000:00:01.4 8086 0b00 0 ioatdma - - 00:06:06.880 I/OAT 0000:00:01.5 8086 0b00 0 ioatdma - - 00:06:06.880 I/OAT 0000:00:01.6 8086 0b00 0 ioatdma - - 00:06:06.880 I/OAT 0000:00:01.7 8086 0b00 
0 ioatdma - - 00:06:06.880 NVMe 0000:65:00.0 8086 0a54 0 nvme nvme0 nvme0n1 00:06:06.880 I/OAT 0000:80:01.0 8086 0b00 1 ioatdma - - 00:06:06.880 I/OAT 0000:80:01.1 8086 0b00 1 ioatdma - - 00:06:06.880 I/OAT 0000:80:01.2 8086 0b00 1 ioatdma - - 00:06:06.880 I/OAT 0000:80:01.3 8086 0b00 1 ioatdma - - 00:06:06.880 I/OAT 0000:80:01.4 8086 0b00 1 ioatdma - - 00:06:06.880 I/OAT 0000:80:01.5 8086 0b00 1 ioatdma - - 00:06:06.880 I/OAT 0000:80:01.6 8086 0b00 1 ioatdma - - 00:06:06.880 I/OAT 0000:80:01.7 8086 0b00 1 ioatdma - - 00:06:06.880 13:14:47 -- spdk/autotest.sh@130 -- # uname -s 00:06:06.880 13:14:47 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]] 00:06:06.880 13:14:47 -- spdk/autotest.sh@132 -- # nvme_namespace_revert 00:06:06.880 13:14:47 -- common/autotest_common.sh@1531 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:11.082 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:06:11.082 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:06:12.990 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:06:12.990 13:14:53 -- common/autotest_common.sh@1532 -- # sleep 1 00:06:13.932 13:14:54 -- common/autotest_common.sh@1533 -- # bdfs=() 
00:06:13.932 13:14:54 -- common/autotest_common.sh@1533 -- # local bdfs 00:06:13.932 13:14:54 -- common/autotest_common.sh@1534 -- # bdfs=($(get_nvme_bdfs)) 00:06:13.932 13:14:54 -- common/autotest_common.sh@1534 -- # get_nvme_bdfs 00:06:13.932 13:14:54 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:13.932 13:14:54 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:13.932 13:14:54 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:13.932 13:14:54 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:13.932 13:14:54 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:13.932 13:14:54 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:13.932 13:14:54 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:65:00.0 00:06:13.932 13:14:54 -- common/autotest_common.sh@1536 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:06:18.135 Waiting for block devices as requested 00:06:18.135 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma 00:06:18.135 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma 00:06:18.135 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma 00:06:18.135 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma 00:06:18.135 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma 00:06:18.135 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma 00:06:18.135 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma 00:06:18.395 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma 00:06:18.395 0000:65:00.0 (8086 0a54): vfio-pci -> nvme 00:06:18.656 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma 00:06:18.656 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma 00:06:18.656 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma 00:06:18.916 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma 00:06:18.916 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma 00:06:18.916 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma 00:06:19.177 0000:00:01.0 (8086 0b00): 
vfio-pci -> ioatdma 00:06:19.177 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma 00:06:19.177 13:14:59 -- common/autotest_common.sh@1538 -- # for bdf in "${bdfs[@]}" 00:06:19.177 13:14:59 -- common/autotest_common.sh@1539 -- # get_nvme_ctrlr_from_bdf 0000:65:00.0 00:06:19.177 13:14:59 -- common/autotest_common.sh@1502 -- # readlink -f /sys/class/nvme/nvme0 00:06:19.177 13:14:59 -- common/autotest_common.sh@1502 -- # grep 0000:65:00.0/nvme/nvme 00:06:19.177 13:14:59 -- common/autotest_common.sh@1502 -- # bdf_sysfs_path=/sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 00:06:19.177 13:14:59 -- common/autotest_common.sh@1503 -- # [[ -z /sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 ]] 00:06:19.177 13:14:59 -- common/autotest_common.sh@1507 -- # basename /sys/devices/pci0000:64/0000:64:02.0/0000:65:00.0/nvme/nvme0 00:06:19.177 13:14:59 -- common/autotest_common.sh@1507 -- # printf '%s\n' nvme0 00:06:19.177 13:14:59 -- common/autotest_common.sh@1539 -- # nvme_ctrlr=/dev/nvme0 00:06:19.177 13:14:59 -- common/autotest_common.sh@1540 -- # [[ -z /dev/nvme0 ]] 00:06:19.177 13:14:59 -- common/autotest_common.sh@1545 -- # nvme id-ctrl /dev/nvme0 00:06:19.177 13:14:59 -- common/autotest_common.sh@1545 -- # grep oacs 00:06:19.177 13:14:59 -- common/autotest_common.sh@1545 -- # cut -d: -f2 00:06:19.177 13:14:59 -- common/autotest_common.sh@1545 -- # oacs=' 0xe' 00:06:19.177 13:14:59 -- common/autotest_common.sh@1546 -- # oacs_ns_manage=8 00:06:19.177 13:14:59 -- common/autotest_common.sh@1548 -- # [[ 8 -ne 0 ]] 00:06:19.177 13:14:59 -- common/autotest_common.sh@1554 -- # nvme id-ctrl /dev/nvme0 00:06:19.177 13:14:59 -- common/autotest_common.sh@1554 -- # grep unvmcap 00:06:19.177 13:14:59 -- common/autotest_common.sh@1554 -- # cut -d: -f2 00:06:19.177 13:14:59 -- common/autotest_common.sh@1554 -- # unvmcap=' 0' 00:06:19.177 13:14:59 -- common/autotest_common.sh@1555 -- # [[ 0 -eq 0 ]] 00:06:19.177 13:14:59 -- common/autotest_common.sh@1557 -- # continue 
00:06:19.177 13:14:59 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:06:19.177 13:14:59 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:19.177 13:14:59 -- common/autotest_common.sh@10 -- # set +x 00:06:19.177 13:14:59 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:06:19.177 13:14:59 -- common/autotest_common.sh@724 -- # xtrace_disable 00:06:19.177 13:14:59 -- common/autotest_common.sh@10 -- # set +x 00:06:19.177 13:14:59 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:06:23.381 0000:80:01.6 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:80:01.7 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:80:01.4 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:80:01.5 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:80:01.2 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:80:01.3 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:80:01.0 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:80:01.1 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:00:01.6 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:00:01.7 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:00:01.4 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:00:01.5 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:00:01.2 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:00:01.3 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:00:01.0 (8086 0b00): ioatdma -> vfio-pci 00:06:23.381 0000:00:01.1 (8086 0b00): ioatdma -> vfio-pci 00:06:25.292 0000:65:00.0 (8086 0a54): nvme -> vfio-pci 00:06:25.292 13:15:05 -- spdk/autotest.sh@140 -- # timing_exit afterboot 00:06:25.292 13:15:05 -- common/autotest_common.sh@730 -- # xtrace_disable 00:06:25.292 13:15:05 -- common/autotest_common.sh@10 -- # set +x 00:06:25.292 13:15:05 -- spdk/autotest.sh@144 -- # opal_revert_cleanup 00:06:25.292 13:15:05 -- common/autotest_common.sh@1591 -- # mapfile -t bdfs 00:06:25.292 13:15:05 -- common/autotest_common.sh@1591 -- # get_nvme_bdfs_by_id 0x0a54 
00:06:25.292 13:15:05 -- common/autotest_common.sh@1577 -- # bdfs=() 00:06:25.292 13:15:05 -- common/autotest_common.sh@1577 -- # local bdfs 00:06:25.292 13:15:05 -- common/autotest_common.sh@1579 -- # get_nvme_bdfs 00:06:25.292 13:15:05 -- common/autotest_common.sh@1513 -- # bdfs=() 00:06:25.292 13:15:05 -- common/autotest_common.sh@1513 -- # local bdfs 00:06:25.292 13:15:05 -- common/autotest_common.sh@1514 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr')) 00:06:25.292 13:15:05 -- common/autotest_common.sh@1514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:06:25.292 13:15:05 -- common/autotest_common.sh@1514 -- # jq -r '.config[].params.traddr' 00:06:25.292 13:15:05 -- common/autotest_common.sh@1515 -- # (( 1 == 0 )) 00:06:25.292 13:15:05 -- common/autotest_common.sh@1519 -- # printf '%s\n' 0000:65:00.0 00:06:25.292 13:15:05 -- common/autotest_common.sh@1579 -- # for bdf in $(get_nvme_bdfs) 00:06:25.292 13:15:05 -- common/autotest_common.sh@1580 -- # cat /sys/bus/pci/devices/0000:65:00.0/device 00:06:25.292 13:15:05 -- common/autotest_common.sh@1580 -- # device=0x0a54 00:06:25.292 13:15:05 -- common/autotest_common.sh@1581 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]] 00:06:25.292 13:15:05 -- common/autotest_common.sh@1582 -- # bdfs+=($bdf) 00:06:25.292 13:15:05 -- common/autotest_common.sh@1586 -- # printf '%s\n' 0000:65:00.0 00:06:25.292 13:15:05 -- common/autotest_common.sh@1592 -- # [[ -z 0000:65:00.0 ]] 00:06:25.292 13:15:05 -- common/autotest_common.sh@1597 -- # spdk_tgt_pid=817213 00:06:25.292 13:15:05 -- common/autotest_common.sh@1598 -- # waitforlisten 817213 00:06:25.292 13:15:05 -- common/autotest_common.sh@1596 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:25.292 13:15:05 -- common/autotest_common.sh@831 -- # '[' -z 817213 ']' 00:06:25.292 13:15:05 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:25.292 13:15:05 -- common/autotest_common.sh@836 
-- # local max_retries=100 00:06:25.292 13:15:05 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:25.292 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:25.292 13:15:05 -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:25.292 13:15:05 -- common/autotest_common.sh@10 -- # set +x 00:06:25.292 [2024-07-25 13:15:06.003150] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:06:25.292 [2024-07-25 13:15:06.003217] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid817213 ] 00:06:25.552 [2024-07-25 13:15:06.091225] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:25.552 [2024-07-25 13:15:06.186514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:26.121 13:15:06 -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:26.121 13:15:06 -- common/autotest_common.sh@864 -- # return 0 00:06:26.121 13:15:06 -- common/autotest_common.sh@1600 -- # bdf_id=0 00:06:26.121 13:15:06 -- common/autotest_common.sh@1601 -- # for bdf in "${bdfs[@]}" 00:06:26.121 13:15:06 -- common/autotest_common.sh@1602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:65:00.0 00:06:29.421 nvme0n1 00:06:29.421 13:15:09 -- common/autotest_common.sh@1604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test 00:06:29.421 [2024-07-25 13:15:10.081002] vbdev_opal_rpc.c: 125:rpc_bdev_nvme_opal_revert: *ERROR*: nvme0 not support opal 00:06:29.421 request: 00:06:29.421 { 00:06:29.421 "nvme_ctrlr_name": "nvme0", 00:06:29.421 "password": "test", 00:06:29.421 "method": "bdev_nvme_opal_revert", 
00:06:29.421 "req_id": 1 00:06:29.421 } 00:06:29.421 Got JSON-RPC error response 00:06:29.421 response: 00:06:29.421 { 00:06:29.421 "code": -32602, 00:06:29.421 "message": "Invalid parameters" 00:06:29.421 } 00:06:29.421 13:15:10 -- common/autotest_common.sh@1604 -- # true 00:06:29.421 13:15:10 -- common/autotest_common.sh@1605 -- # (( ++bdf_id )) 00:06:29.421 13:15:10 -- common/autotest_common.sh@1608 -- # killprocess 817213 00:06:29.421 13:15:10 -- common/autotest_common.sh@950 -- # '[' -z 817213 ']' 00:06:29.421 13:15:10 -- common/autotest_common.sh@954 -- # kill -0 817213 00:06:29.421 13:15:10 -- common/autotest_common.sh@955 -- # uname 00:06:29.421 13:15:10 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:29.421 13:15:10 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 817213 00:06:29.421 13:15:10 -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:29.421 13:15:10 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:29.421 13:15:10 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 817213' 00:06:29.421 killing process with pid 817213 00:06:29.421 13:15:10 -- common/autotest_common.sh@969 -- # kill 817213 00:06:29.421 13:15:10 -- common/autotest_common.sh@974 -- # wait 817213 00:06:32.029 13:15:12 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']' 00:06:32.029 13:15:12 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']' 00:06:32.029 13:15:12 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]] 00:06:32.029 13:15:12 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]] 00:06:32.029 13:15:12 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh 00:06:32.601 Restarting all devices. 
00:06:36.801 lstat() error: No such file or directory
00:06:36.801 QAT Error: No GENERAL section found
00:06:36.801 Failed to configure qat_dev0
00:06:36.801 lstat() error: No such file or directory
00:06:36.801 QAT Error: No GENERAL section found
00:06:36.801 Failed to configure qat_dev1
00:06:36.801 lstat() error: No such file or directory
00:06:36.801 QAT Error: No GENERAL section found
00:06:36.801 Failed to configure qat_dev2
00:06:36.801 enable sriov
00:06:36.801 Checking status of all devices.
00:06:36.801 There is 3 QAT acceleration device(s) in the system:
00:06:36.801 qat_dev0 - type: c6xx, inst_id: 0, node_id: 1, bsf: 0000:cc:00.0, #accel: 5 #engines: 10 state: down
00:06:36.801 qat_dev1 - type: c6xx, inst_id: 1, node_id: 1, bsf: 0000:ce:00.0, #accel: 5 #engines: 10 state: down
00:06:36.801 qat_dev2 - type: c6xx, inst_id: 2, node_id: 1, bsf: 0000:d0:00.0, #accel: 5 #engines: 10 state: down
00:06:36.801 0000:cc:00.0 set to 16 VFs
00:06:37.061 0000:ce:00.0 set to 16 VFs
00:06:37.632 0000:d0:00.0 set to 16 VFs
00:06:37.632 Properly configured the qat device with driver uio_pci_generic.
00:06:37.632 13:15:18 -- spdk/autotest.sh@162 -- # timing_enter lib
00:06:37.632 13:15:18 -- common/autotest_common.sh@724 -- # xtrace_disable
00:06:37.632 13:15:18 -- common/autotest_common.sh@10 -- # set +x
00:06:37.893 13:15:18 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]]
00:06:37.893 13:15:18 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh
00:06:37.893 13:15:18 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:37.893 13:15:18 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:37.893 13:15:18 -- common/autotest_common.sh@10 -- # set +x
00:06:37.893 ************************************
00:06:37.893 START TEST env
00:06:37.893 ************************************
00:06:37.893 13:15:18 env -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh
00:06:37.894 * Looking for test storage...
00:06:37.894 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env
00:06:37.894 13:15:18 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut
00:06:37.894 13:15:18 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:37.894 13:15:18 env -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:37.894 13:15:18 env -- common/autotest_common.sh@10 -- # set +x
00:06:37.894 ************************************
00:06:37.894 START TEST env_memory
00:06:37.894 ************************************
00:06:37.894 13:15:18 env.env_memory -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut
00:06:37.894
00:06:37.894
00:06:37.894 CUnit - A unit testing framework for C - Version 2.1-3
00:06:37.894 http://cunit.sourceforge.net/
00:06:37.894
00:06:37.894
00:06:37.894 Suite: memory
00:06:37.894 Test: alloc and free memory map ...[2024-07-25 13:15:18.664237] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed
00:06:37.894 passed
00:06:38.154 Test: mem map translation ...[2024-07-25 13:15:18.687993] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234
00:06:38.154 [2024-07-25 13:15:18.688024] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152
00:06:38.154 [2024-07-25 13:15:18.688068] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656
00:06:38.154 [2024-07-25 13:15:18.688076] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map
00:06:38.154 passed
00:06:38.154 Test: mem map registration ...[2024-07-25 13:15:18.739147] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234
00:06:38.154 [2024-07-25 13:15:18.739169] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152
00:06:38.154 passed
00:06:38.155 Test: mem map adjacent registrations ...passed
00:06:38.155
00:06:38.155 Run Summary: Type Total Ran Passed Failed Inactive
00:06:38.155 suites 1 1 n/a 0 0
00:06:38.155 tests 4 4 4 0 0
00:06:38.155 asserts 152 152 152 0 n/a
00:06:38.155
00:06:38.155 Elapsed time = 0.180 seconds
00:06:38.155
00:06:38.155 real 0m0.194s
00:06:38.155 user 0m0.186s
00:06:38.155 sys 0m0.007s
00:06:38.155 13:15:18 env.env_memory -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:38.155 13:15:18 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:06:38.155 ************************************ 00:06:38.155 END TEST env_memory 00:06:38.155 ************************************ 00:06:38.155 13:15:18 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:38.155 13:15:18 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:38.155 13:15:18 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:38.155 13:15:18 env -- common/autotest_common.sh@10 -- # set +x 00:06:38.155 ************************************ 00:06:38.155 START TEST env_vtophys 00:06:38.155 ************************************ 00:06:38.155 13:15:18 env.env_vtophys -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:06:38.155 EAL: lib.eal log level changed from notice to debug 00:06:38.155 EAL: Detected lcore 0 as core 0 on socket 0 00:06:38.155 EAL: Detected lcore 1 as core 1 on socket 0 00:06:38.155 EAL: Detected lcore 2 as core 2 on socket 0 00:06:38.155 EAL: Detected lcore 3 as core 3 on socket 0 00:06:38.155 EAL: Detected lcore 4 as core 4 on socket 0 00:06:38.155 EAL: Detected lcore 5 as core 5 on socket 0 00:06:38.155 EAL: Detected lcore 6 as core 6 on socket 0 00:06:38.155 EAL: Detected lcore 7 as core 7 on socket 0 00:06:38.155 EAL: Detected lcore 8 as core 8 on socket 0 00:06:38.155 EAL: Detected lcore 9 as core 9 on socket 0 00:06:38.155 EAL: Detected lcore 10 as core 10 on socket 0 00:06:38.155 EAL: Detected lcore 11 as core 11 on socket 0 00:06:38.155 EAL: Detected lcore 12 as core 12 on socket 0 00:06:38.155 EAL: Detected lcore 13 as core 13 on socket 0 00:06:38.155 EAL: Detected lcore 14 as core 14 on socket 0 00:06:38.155 EAL: Detected lcore 15 as core 15 on socket 0 00:06:38.155 EAL: Detected lcore 16 as core 16 on socket 0 00:06:38.155 EAL: Detected lcore 17 as core 17 on socket 0 00:06:38.155 EAL: 
Detected lcore 18 as core 18 on socket 0 00:06:38.155 EAL: Detected lcore 19 as core 19 on socket 0 00:06:38.155 EAL: Detected lcore 20 as core 20 on socket 0 00:06:38.155 EAL: Detected lcore 21 as core 21 on socket 0 00:06:38.155 EAL: Detected lcore 22 as core 22 on socket 0 00:06:38.155 EAL: Detected lcore 23 as core 23 on socket 0 00:06:38.155 EAL: Detected lcore 24 as core 24 on socket 0 00:06:38.155 EAL: Detected lcore 25 as core 25 on socket 0 00:06:38.155 EAL: Detected lcore 26 as core 26 on socket 0 00:06:38.155 EAL: Detected lcore 27 as core 27 on socket 0 00:06:38.155 EAL: Detected lcore 28 as core 28 on socket 0 00:06:38.155 EAL: Detected lcore 29 as core 29 on socket 0 00:06:38.155 EAL: Detected lcore 30 as core 30 on socket 0 00:06:38.155 EAL: Detected lcore 31 as core 31 on socket 0 00:06:38.155 EAL: Detected lcore 32 as core 0 on socket 1 00:06:38.155 EAL: Detected lcore 33 as core 1 on socket 1 00:06:38.155 EAL: Detected lcore 34 as core 2 on socket 1 00:06:38.155 EAL: Detected lcore 35 as core 3 on socket 1 00:06:38.155 EAL: Detected lcore 36 as core 4 on socket 1 00:06:38.155 EAL: Detected lcore 37 as core 5 on socket 1 00:06:38.155 EAL: Detected lcore 38 as core 6 on socket 1 00:06:38.155 EAL: Detected lcore 39 as core 7 on socket 1 00:06:38.155 EAL: Detected lcore 40 as core 8 on socket 1 00:06:38.155 EAL: Detected lcore 41 as core 9 on socket 1 00:06:38.155 EAL: Detected lcore 42 as core 10 on socket 1 00:06:38.155 EAL: Detected lcore 43 as core 11 on socket 1 00:06:38.155 EAL: Detected lcore 44 as core 12 on socket 1 00:06:38.155 EAL: Detected lcore 45 as core 13 on socket 1 00:06:38.155 EAL: Detected lcore 46 as core 14 on socket 1 00:06:38.155 EAL: Detected lcore 47 as core 15 on socket 1 00:06:38.155 EAL: Detected lcore 48 as core 16 on socket 1 00:06:38.155 EAL: Detected lcore 49 as core 17 on socket 1 00:06:38.155 EAL: Detected lcore 50 as core 18 on socket 1 00:06:38.155 EAL: Detected lcore 51 as core 19 on socket 1 00:06:38.155 EAL: 
Detected lcore 52 as core 20 on socket 1 00:06:38.155 EAL: Detected lcore 53 as core 21 on socket 1 00:06:38.155 EAL: Detected lcore 54 as core 22 on socket 1 00:06:38.155 EAL: Detected lcore 55 as core 23 on socket 1 00:06:38.155 EAL: Detected lcore 56 as core 24 on socket 1 00:06:38.155 EAL: Detected lcore 57 as core 25 on socket 1 00:06:38.155 EAL: Detected lcore 58 as core 26 on socket 1 00:06:38.155 EAL: Detected lcore 59 as core 27 on socket 1 00:06:38.155 EAL: Detected lcore 60 as core 28 on socket 1 00:06:38.155 EAL: Detected lcore 61 as core 29 on socket 1 00:06:38.155 EAL: Detected lcore 62 as core 30 on socket 1 00:06:38.155 EAL: Detected lcore 63 as core 31 on socket 1 00:06:38.155 EAL: Detected lcore 64 as core 0 on socket 0 00:06:38.155 EAL: Detected lcore 65 as core 1 on socket 0 00:06:38.155 EAL: Detected lcore 66 as core 2 on socket 0 00:06:38.155 EAL: Detected lcore 67 as core 3 on socket 0 00:06:38.155 EAL: Detected lcore 68 as core 4 on socket 0 00:06:38.155 EAL: Detected lcore 69 as core 5 on socket 0 00:06:38.155 EAL: Detected lcore 70 as core 6 on socket 0 00:06:38.155 EAL: Detected lcore 71 as core 7 on socket 0 00:06:38.155 EAL: Detected lcore 72 as core 8 on socket 0 00:06:38.155 EAL: Detected lcore 73 as core 9 on socket 0 00:06:38.155 EAL: Detected lcore 74 as core 10 on socket 0 00:06:38.155 EAL: Detected lcore 75 as core 11 on socket 0 00:06:38.155 EAL: Detected lcore 76 as core 12 on socket 0 00:06:38.155 EAL: Detected lcore 77 as core 13 on socket 0 00:06:38.155 EAL: Detected lcore 78 as core 14 on socket 0 00:06:38.155 EAL: Detected lcore 79 as core 15 on socket 0 00:06:38.155 EAL: Detected lcore 80 as core 16 on socket 0 00:06:38.155 EAL: Detected lcore 81 as core 17 on socket 0 00:06:38.155 EAL: Detected lcore 82 as core 18 on socket 0 00:06:38.155 EAL: Detected lcore 83 as core 19 on socket 0 00:06:38.155 EAL: Detected lcore 84 as core 20 on socket 0 00:06:38.155 EAL: Detected lcore 85 as core 21 on socket 0 00:06:38.155 EAL: 
Detected lcore 86 as core 22 on socket 0 00:06:38.155 EAL: Detected lcore 87 as core 23 on socket 0 00:06:38.155 EAL: Detected lcore 88 as core 24 on socket 0 00:06:38.155 EAL: Detected lcore 89 as core 25 on socket 0 00:06:38.155 EAL: Detected lcore 90 as core 26 on socket 0 00:06:38.155 EAL: Detected lcore 91 as core 27 on socket 0 00:06:38.155 EAL: Detected lcore 92 as core 28 on socket 0 00:06:38.155 EAL: Detected lcore 93 as core 29 on socket 0 00:06:38.155 EAL: Detected lcore 94 as core 30 on socket 0 00:06:38.155 EAL: Detected lcore 95 as core 31 on socket 0 00:06:38.155 EAL: Detected lcore 96 as core 0 on socket 1 00:06:38.155 EAL: Detected lcore 97 as core 1 on socket 1 00:06:38.155 EAL: Detected lcore 98 as core 2 on socket 1 00:06:38.155 EAL: Detected lcore 99 as core 3 on socket 1 00:06:38.155 EAL: Detected lcore 100 as core 4 on socket 1 00:06:38.155 EAL: Detected lcore 101 as core 5 on socket 1 00:06:38.155 EAL: Detected lcore 102 as core 6 on socket 1 00:06:38.155 EAL: Detected lcore 103 as core 7 on socket 1 00:06:38.155 EAL: Detected lcore 104 as core 8 on socket 1 00:06:38.155 EAL: Detected lcore 105 as core 9 on socket 1 00:06:38.155 EAL: Detected lcore 106 as core 10 on socket 1 00:06:38.155 EAL: Detected lcore 107 as core 11 on socket 1 00:06:38.155 EAL: Detected lcore 108 as core 12 on socket 1 00:06:38.155 EAL: Detected lcore 109 as core 13 on socket 1 00:06:38.155 EAL: Detected lcore 110 as core 14 on socket 1 00:06:38.155 EAL: Detected lcore 111 as core 15 on socket 1 00:06:38.155 EAL: Detected lcore 112 as core 16 on socket 1 00:06:38.155 EAL: Detected lcore 113 as core 17 on socket 1 00:06:38.155 EAL: Detected lcore 114 as core 18 on socket 1 00:06:38.155 EAL: Detected lcore 115 as core 19 on socket 1 00:06:38.155 EAL: Detected lcore 116 as core 20 on socket 1 00:06:38.155 EAL: Detected lcore 117 as core 21 on socket 1 00:06:38.155 EAL: Detected lcore 118 as core 22 on socket 1 00:06:38.155 EAL: Detected lcore 119 as core 23 on socket 1 
00:06:38.155 EAL: Detected lcore 120 as core 24 on socket 1 00:06:38.155 EAL: Detected lcore 121 as core 25 on socket 1 00:06:38.155 EAL: Detected lcore 122 as core 26 on socket 1 00:06:38.155 EAL: Detected lcore 123 as core 27 on socket 1 00:06:38.155 EAL: Detected lcore 124 as core 28 on socket 1 00:06:38.155 EAL: Detected lcore 125 as core 29 on socket 1 00:06:38.155 EAL: Detected lcore 126 as core 30 on socket 1 00:06:38.155 EAL: Detected lcore 127 as core 31 on socket 1 00:06:38.155 EAL: Maximum logical cores by configuration: 128 00:06:38.155 EAL: Detected CPU lcores: 128 00:06:38.155 EAL: Detected NUMA nodes: 2 00:06:38.155 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:06:38.155 EAL: Detected shared linkage of DPDK 00:06:38.155 EAL: No shared files mode enabled, IPC will be disabled 00:06:38.417 EAL: No shared files mode enabled, IPC is disabled 00:06:38.417 EAL: PCI driver qat for device 0000:cc:01.0 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:01.1 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:01.2 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:01.3 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:01.4 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:01.5 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:01.6 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:01.7 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:02.0 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:02.1 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:02.2 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:02.3 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:02.4 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:02.5 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:02.6 wants 
IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:cc:02.7 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:01.0 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:01.1 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:01.2 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:01.3 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:01.4 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:01.5 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:01.6 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:01.7 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:02.0 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:02.1 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:02.2 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:02.3 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:02.4 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:02.5 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:02.6 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:ce:02.7 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:01.0 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:01.1 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:01.2 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:01.3 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:01.4 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:01.5 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:01.6 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:01.7 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:02.0 wants IOVA as 'PA' 
00:06:38.417 EAL: PCI driver qat for device 0000:d0:02.1 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:02.2 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:02.3 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:02.4 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:02.5 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:02.6 wants IOVA as 'PA' 00:06:38.417 EAL: PCI driver qat for device 0000:d0:02.7 wants IOVA as 'PA' 00:06:38.417 EAL: Bus pci wants IOVA as 'PA' 00:06:38.417 EAL: Bus auxiliary wants IOVA as 'DC' 00:06:38.417 EAL: Bus vdev wants IOVA as 'DC' 00:06:38.417 EAL: Selected IOVA mode 'PA' 00:06:38.417 EAL: Probing VFIO support... 00:06:38.417 EAL: IOMMU type 1 (Type 1) is supported 00:06:38.417 EAL: IOMMU type 7 (sPAPR) is not supported 00:06:38.417 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:06:38.417 EAL: VFIO support initialized 00:06:38.417 EAL: Ask a virtual area of 0x2e000 bytes 00:06:38.417 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:06:38.417 EAL: Setting up physically contiguous memory... 
00:06:38.417 EAL: Setting maximum number of open files to 524288 00:06:38.417 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:06:38.417 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:06:38.417 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:06:38.417 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.417 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:06:38.417 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:38.417 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.417 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:06:38.417 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:06:38.418 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.418 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:06:38.418 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:38.418 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.418 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:06:38.418 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:06:38.418 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.418 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:06:38.418 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:38.418 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.418 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:06:38.418 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:06:38.418 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.418 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:06:38.418 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:06:38.418 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.418 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:06:38.418 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:06:38.418 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:06:38.418 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.418 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:06:38.418 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:38.418 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.418 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:06:38.418 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:06:38.418 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.418 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:06:38.418 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:38.418 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.418 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:06:38.418 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:06:38.418 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.418 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:06:38.418 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:38.418 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.418 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:06:38.418 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:06:38.418 EAL: Ask a virtual area of 0x61000 bytes 00:06:38.418 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:06:38.418 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:06:38.418 EAL: Ask a virtual area of 0x400000000 bytes 00:06:38.418 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:06:38.418 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:06:38.418 EAL: Hugepages will be freed exactly as allocated. 
00:06:38.418 EAL: No shared files mode enabled, IPC is disabled 00:06:38.418 EAL: No shared files mode enabled, IPC is disabled 00:06:38.418 EAL: TSC frequency is ~2600000 KHz 00:06:38.418 EAL: Main lcore 0 is ready (tid=7effac17cb00;cpuset=[0]) 00:06:38.418 EAL: Trying to obtain current memory policy. 00:06:38.418 EAL: Setting policy MPOL_PREFERRED for socket 0 00:06:38.418 EAL: Restoring previous memory policy: 0 00:06:38.418 EAL: request: mp_malloc_sync 00:06:38.418 EAL: No shared files mode enabled, IPC is disabled 00:06:38.418 EAL: Heap on socket 0 was expanded by 2MB 00:06:38.418 EAL: PCI device 0000:cc:01.0 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001000000 00:06:38.418 EAL: PCI memory mapped at 0x202001001000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:06:38.418 EAL: Trying to obtain current memory policy. 00:06:38.418 EAL: Setting policy MPOL_PREFERRED for socket 1 00:06:38.418 EAL: Restoring previous memory policy: 4 00:06:38.418 EAL: request: mp_malloc_sync 00:06:38.418 EAL: No shared files mode enabled, IPC is disabled 00:06:38.418 EAL: Heap on socket 1 was expanded by 2MB 00:06:38.418 EAL: PCI device 0000:cc:01.1 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001002000 00:06:38.418 EAL: PCI memory mapped at 0x202001003000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:01.2 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001004000 00:06:38.418 EAL: PCI memory mapped at 0x202001005000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:01.3 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001006000 00:06:38.418 EAL: PCI memory mapped 
at 0x202001007000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:01.4 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001008000 00:06:38.418 EAL: PCI memory mapped at 0x202001009000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:01.5 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x20200100a000 00:06:38.418 EAL: PCI memory mapped at 0x20200100b000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:01.6 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x20200100c000 00:06:38.418 EAL: PCI memory mapped at 0x20200100d000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:01.7 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x20200100e000 00:06:38.418 EAL: PCI memory mapped at 0x20200100f000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:02.0 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001010000 00:06:38.418 EAL: PCI memory mapped at 0x202001011000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:02.1 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001012000 00:06:38.418 EAL: PCI memory mapped at 0x202001013000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:02.2 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 
00:06:38.418 EAL: PCI memory mapped at 0x202001014000 00:06:38.418 EAL: PCI memory mapped at 0x202001015000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:02.3 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001016000 00:06:38.418 EAL: PCI memory mapped at 0x202001017000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:02.4 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001018000 00:06:38.418 EAL: PCI memory mapped at 0x202001019000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:02.5 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x20200101a000 00:06:38.418 EAL: PCI memory mapped at 0x20200101b000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:02.6 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x20200101c000 00:06:38.418 EAL: PCI memory mapped at 0x20200101d000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:06:38.418 EAL: PCI device 0000:cc:02.7 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x20200101e000 00:06:38.418 EAL: PCI memory mapped at 0x20200101f000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:06:38.418 EAL: PCI device 0000:ce:01.0 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001020000 00:06:38.418 EAL: PCI memory mapped at 0x202001021000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:06:38.418 EAL: PCI 
device 0000:ce:01.1 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001022000 00:06:38.418 EAL: PCI memory mapped at 0x202001023000 00:06:38.418 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:06:38.418 EAL: PCI device 0000:ce:01.2 on NUMA socket 1 00:06:38.418 EAL: probe driver: 8086:37c9 qat 00:06:38.418 EAL: PCI memory mapped at 0x202001024000 00:06:38.419 EAL: PCI memory mapped at 0x202001025000 00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:06:38.419 EAL: PCI device 0000:ce:01.3 on NUMA socket 1 00:06:38.419 EAL: probe driver: 8086:37c9 qat 00:06:38.419 EAL: PCI memory mapped at 0x202001026000 00:06:38.419 EAL: PCI memory mapped at 0x202001027000 00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:06:38.419 EAL: PCI device 0000:ce:01.4 on NUMA socket 1 00:06:38.419 EAL: probe driver: 8086:37c9 qat 00:06:38.419 EAL: PCI memory mapped at 0x202001028000 00:06:38.419 EAL: PCI memory mapped at 0x202001029000 00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:06:38.419 EAL: PCI device 0000:ce:01.5 on NUMA socket 1 00:06:38.419 EAL: probe driver: 8086:37c9 qat 00:06:38.419 EAL: PCI memory mapped at 0x20200102a000 00:06:38.419 EAL: PCI memory mapped at 0x20200102b000 00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:06:38.419 EAL: PCI device 0000:ce:01.6 on NUMA socket 1 00:06:38.419 EAL: probe driver: 8086:37c9 qat 00:06:38.419 EAL: PCI memory mapped at 0x20200102c000 00:06:38.419 EAL: PCI memory mapped at 0x20200102d000 00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:06:38.419 EAL: PCI device 0000:ce:01.7 on NUMA socket 1 00:06:38.419 EAL: probe driver: 8086:37c9 qat 00:06:38.419 EAL: PCI memory mapped at 0x20200102e000 00:06:38.419 EAL: PCI memory mapped at 0x20200102f000 00:06:38.419 EAL: Probe 
PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1)
00:06:38.419 EAL: PCI device 0000:ce:02.0 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001030000
00:06:38.419 EAL: PCI memory mapped at 0x202001031000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1)
00:06:38.419 EAL: PCI device 0000:ce:02.1 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001032000
00:06:38.419 EAL: PCI memory mapped at 0x202001033000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1)
00:06:38.419 EAL: PCI device 0000:ce:02.2 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001034000
00:06:38.419 EAL: PCI memory mapped at 0x202001035000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1)
00:06:38.419 EAL: PCI device 0000:ce:02.3 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001036000
00:06:38.419 EAL: PCI memory mapped at 0x202001037000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1)
00:06:38.419 EAL: PCI device 0000:ce:02.4 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001038000
00:06:38.419 EAL: PCI memory mapped at 0x202001039000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1)
00:06:38.419 EAL: PCI device 0000:ce:02.5 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x20200103a000
00:06:38.419 EAL: PCI memory mapped at 0x20200103b000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1)
00:06:38.419 EAL: PCI device 0000:ce:02.6 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x20200103c000
00:06:38.419 EAL: PCI memory mapped at 0x20200103d000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1)
00:06:38.419 EAL: PCI device 0000:ce:02.7 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x20200103e000
00:06:38.419 EAL: PCI memory mapped at 0x20200103f000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:01.0 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001040000
00:06:38.419 EAL: PCI memory mapped at 0x202001041000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:01.1 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001042000
00:06:38.419 EAL: PCI memory mapped at 0x202001043000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:01.2 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001044000
00:06:38.419 EAL: PCI memory mapped at 0x202001045000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:01.3 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001046000
00:06:38.419 EAL: PCI memory mapped at 0x202001047000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:01.4 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001048000
00:06:38.419 EAL: PCI memory mapped at 0x202001049000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:01.5 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x20200104a000
00:06:38.419 EAL: PCI memory mapped at 0x20200104b000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:01.6 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x20200104c000
00:06:38.419 EAL: PCI memory mapped at 0x20200104d000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:01.7 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x20200104e000
00:06:38.419 EAL: PCI memory mapped at 0x20200104f000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:02.0 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001050000
00:06:38.419 EAL: PCI memory mapped at 0x202001051000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:02.1 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001052000
00:06:38.419 EAL: PCI memory mapped at 0x202001053000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:02.2 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001054000
00:06:38.419 EAL: PCI memory mapped at 0x202001055000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:02.3 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001056000
00:06:38.419 EAL: PCI memory mapped at 0x202001057000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device:
0000:d0:02.3 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:02.4 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x202001058000
00:06:38.419 EAL: PCI memory mapped at 0x202001059000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:02.5 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x20200105a000
00:06:38.419 EAL: PCI memory mapped at 0x20200105b000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:02.6 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x20200105c000
00:06:38.419 EAL: PCI memory mapped at 0x20200105d000
00:06:38.419 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1)
00:06:38.419 EAL: PCI device 0000:d0:02.7 on NUMA socket 1
00:06:38.419 EAL: probe driver: 8086:37c9 qat
00:06:38.419 EAL: PCI memory mapped at 0x20200105e000
00:06:38.419 EAL: PCI memory mapped at 0x20200105f000
00:06:38.420 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1)
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: No PCI address specified using 'addr=' in: bus=pci
00:06:38.420 EAL: Mem event callback 'spdk:(nil)' registered
00:06:38.420
00:06:38.420
00:06:38.420 CUnit - A unit testing framework for C - Version 2.1-3
00:06:38.420 http://cunit.sourceforge.net/
00:06:38.420
00:06:38.420
00:06:38.420 Suite: components_suite
00:06:38.420 Test: vtophys_malloc_test ...passed
00:06:38.420 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:06:38.420 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:38.420 EAL: Restoring previous memory policy: 4
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was expanded by 4MB
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was shrunk by 4MB
00:06:38.420 EAL: Trying to obtain current memory policy.
00:06:38.420 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:38.420 EAL: Restoring previous memory policy: 4
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was expanded by 6MB
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was shrunk by 6MB
00:06:38.420 EAL: Trying to obtain current memory policy.
00:06:38.420 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:38.420 EAL: Restoring previous memory policy: 4
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was expanded by 10MB
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was shrunk by 10MB
00:06:38.420 EAL: Trying to obtain current memory policy.
00:06:38.420 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:38.420 EAL: Restoring previous memory policy: 4
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was expanded by 18MB
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was shrunk by 18MB
00:06:38.420 EAL: Trying to obtain current memory policy.
00:06:38.420 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:38.420 EAL: Restoring previous memory policy: 4
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was expanded by 34MB
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was shrunk by 34MB
00:06:38.420 EAL: Trying to obtain current memory policy.
00:06:38.420 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:38.420 EAL: Restoring previous memory policy: 4
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was expanded by 66MB
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was shrunk by 66MB
00:06:38.420 EAL: Trying to obtain current memory policy.
00:06:38.420 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:38.420 EAL: Restoring previous memory policy: 4
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was expanded by 130MB
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was shrunk by 130MB
00:06:38.420 EAL: Trying to obtain current memory policy.
00:06:38.420 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:38.420 EAL: Restoring previous memory policy: 4
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was expanded by 258MB
00:06:38.420 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.420 EAL: request: mp_malloc_sync
00:06:38.420 EAL: No shared files mode enabled, IPC is disabled
00:06:38.420 EAL: Heap on socket 0 was shrunk by 258MB
00:06:38.420 EAL: Trying to obtain current memory policy.
00:06:38.420 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:38.680 EAL: Restoring previous memory policy: 4
00:06:38.680 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.680 EAL: request: mp_malloc_sync
00:06:38.680 EAL: No shared files mode enabled, IPC is disabled
00:06:38.680 EAL: Heap on socket 0 was expanded by 514MB
00:06:38.680 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.680 EAL: request: mp_malloc_sync
00:06:38.680 EAL: No shared files mode enabled, IPC is disabled
00:06:38.680 EAL: Heap on socket 0 was shrunk by 514MB
00:06:38.680 EAL: Trying to obtain current memory policy.
00:06:38.680 EAL: Setting policy MPOL_PREFERRED for socket 0
00:06:38.940 EAL: Restoring previous memory policy: 4
00:06:38.940 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.940 EAL: request: mp_malloc_sync
00:06:38.940 EAL: No shared files mode enabled, IPC is disabled
00:06:38.940 EAL: Heap on socket 0 was expanded by 1026MB
00:06:38.941 EAL: Calling mem event callback 'spdk:(nil)'
00:06:38.941 EAL: request: mp_malloc_sync
00:06:38.941 EAL: No shared files mode enabled, IPC is disabled
00:06:38.941 EAL: Heap on socket 0 was shrunk by 1026MB
00:06:38.941 passed
00:06:38.941
00:06:38.941 Run Summary: Type Total Ran Passed Failed Inactive
00:06:38.941 suites 1 1 n/a 0 0
00:06:38.941 tests 2 2 2 0 0
00:06:38.941 asserts 6674 6674 6674 0 n/a
00:06:38.941
00:06:38.941 Elapsed time = 0.677 seconds
00:06:38.941 EAL: No shared files mode enabled, IPC is disabled
00:06:38.941 EAL: No shared files mode enabled, IPC is disabled
00:06:38.941 EAL: No shared files mode enabled, IPC is disabled
00:06:38.941
00:06:38.941 real 0m0.837s
00:06:38.941 user 0m0.428s
00:06:38.941 sys 0m0.380s
00:06:38.941 13:15:19 env.env_vtophys -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:38.941 13:15:19 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:06:38.941 ************************************
00:06:38.941 END TEST env_vtophys
00:06:38.941 ************************************
00:06:39.201 13:15:19 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:39.201 13:15:19 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:06:39.201 13:15:19 env -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:39.201 13:15:19 env -- common/autotest_common.sh@10 -- # set +x
00:06:39.201 ************************************
00:06:39.201 START TEST env_pci
00:06:39.201 ************************************
00:06:39.201 13:15:19 env.env_pci -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:06:39.201
00:06:39.201
00:06:39.201 CUnit - A unit testing framework for C - Version 2.1-3
00:06:39.201 http://cunit.sourceforge.net/
00:06:39.201
00:06:39.201
00:06:39.201 Suite: pci
00:06:39.201 Test: pci_hook ...[2024-07-25 13:15:19.820933] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 820292 has claimed it
00:06:39.201 EAL: Cannot find device (10000:00:01.0)
00:06:39.201 EAL: Failed to attach device on primary process
00:06:39.201 passed
00:06:39.201
00:06:39.201 Run Summary: Type Total Ran Passed Failed Inactive
00:06:39.201 suites 1 1 n/a 0 0
00:06:39.201 tests 1 1 1 0 0
00:06:39.201 asserts 25 25 25 0 n/a
00:06:39.201
00:06:39.201 Elapsed time = 0.032 seconds
00:06:39.201
00:06:39.201 real 0m0.060s
00:06:39.201 user 0m0.019s
00:06:39.201 sys 0m0.040s
00:06:39.201 13:15:19 env.env_pci -- common/autotest_common.sh@1126 -- # xtrace_disable
00:06:39.201 13:15:19 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:06:39.201 ************************************
00:06:39.201 END TEST env_pci
00:06:39.201 ************************************
00:06:39.201 13:15:19 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:06:39.201 13:15:19 env -- env/env.sh@15 -- # uname
00:06:39.201 13:15:19 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:06:39.201 13:15:19 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:06:39.201 13:15:19 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:39.201 13:15:19 env -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']'
00:06:39.201 13:15:19 env -- common/autotest_common.sh@1107 -- # xtrace_disable
00:06:39.201 13:15:19 env -- common/autotest_common.sh@10 -- # set +x
00:06:39.201 ************************************
00:06:39.201 START TEST env_dpdk_post_init
00:06:39.201 ************************************
00:06:39.201 13:15:19 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:06:39.201 EAL: Detected CPU lcores: 128
00:06:39.201 EAL: Detected NUMA nodes: 2
00:06:39.201 EAL: Detected shared linkage of DPDK
00:06:39.201 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket
00:06:39.463 EAL: Selected IOVA mode 'PA'
00:06:39.463 EAL: VFIO support initialized
00:06:39.463 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1)
00:06:39.463 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_asym
00:06:39.463 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.463 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_sym
00:06:39.463 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.463 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1)
00:06:39.463 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_asym
00:06:39.463 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.463 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_sym
00:06:39.463 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.463 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1)
00:06:39.463 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_asym
00:06:39.463 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.463 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_sym
00:06:39.463 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.463 EAL: Probe PCI driver: qat
(8086:37c9) device: 0000:cc:01.3 (socket 1)
00:06:39.463 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_asym
00:06:39.463 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.463 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_sym
00:06:39.463 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.463 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1)
00:06:39.463 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_asym
00:06:39.463 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.463 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_sym
00:06:39.463 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.3 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1)
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_asym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.464 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_sym
00:06:39.464 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.464 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1)
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_asym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_asym,socket id: 1, max queue pairs: 0
00:06:39.465 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_sym
00:06:39.465 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_sym,socket id: 1, max queue pairs: 0
00:06:39.465 TELEMETRY: No legacy callbacks, legacy socket not created
00:06:39.465 EAL: Using IOMMU type 1 (Type 1)
00:06:39.465 EAL: Ignore mapping IO port bar(1)
00:06:39.738 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.0 (socket 0)
00:06:39.738 EAL: Ignore mapping IO port bar(1) 00:06:39.998 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.1 (socket 0) 00:06:39.998 EAL: Ignore mapping IO port bar(1) 00:06:39.998 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.2 (socket 0) 00:06:40.258 EAL: Ignore mapping IO port bar(1) 00:06:40.258 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.3 (socket 0) 00:06:40.518 EAL: Ignore mapping IO port bar(1) 00:06:40.519 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.4 (socket 0) 00:06:40.779 EAL: Ignore mapping IO port bar(1) 00:06:40.779 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.5 (socket 0) 00:06:40.779 EAL: Ignore mapping IO port bar(1) 00:06:41.038 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.6 (socket 0) 00:06:41.038 EAL: Ignore mapping IO port bar(1) 00:06:41.299 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:00:01.7 (socket 0) 00:06:41.869 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:65:00.0 (socket 0) 00:06:42.129 EAL: Ignore mapping IO port bar(1) 00:06:42.129 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.0 (socket 1) 00:06:42.389 EAL: Ignore mapping IO port bar(1) 00:06:42.389 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.1 (socket 1) 00:06:42.648 EAL: Ignore mapping IO port bar(1) 00:06:42.648 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.2 (socket 1) 00:06:42.648 EAL: Ignore mapping IO port bar(1) 00:06:42.908 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.3 (socket 1) 00:06:42.908 EAL: Ignore mapping IO port bar(1) 00:06:43.168 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.4 (socket 1) 00:06:43.168 EAL: Ignore mapping IO port bar(1) 00:06:43.428 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.5 (socket 1) 00:06:43.428 EAL: Ignore mapping IO port bar(1) 00:06:43.428 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 
0000:80:01.6 (socket 1) 00:06:43.688 EAL: Ignore mapping IO port bar(1) 00:06:43.688 EAL: Probe PCI driver: spdk_ioat (8086:0b00) device: 0000:80:01.7 (socket 1) 00:06:47.900 EAL: Releasing PCI mapped resource for 0000:65:00.0 00:06:47.900 EAL: Calling pci_unmap_resource for 0000:65:00.0 at 0x202001080000 00:06:47.900 Starting DPDK initialization... 00:06:47.900 Starting SPDK post initialization... 00:06:47.900 SPDK NVMe probe 00:06:47.900 Attaching to 0000:65:00.0 00:06:47.900 Attached to 0000:65:00.0 00:06:47.900 Cleaning up... 00:06:49.810 00:06:49.810 real 0m10.405s 00:06:49.810 user 0m4.261s 00:06:49.810 sys 0m0.171s 00:06:49.810 13:15:30 env.env_dpdk_post_init -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:49.810 13:15:30 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:06:49.810 ************************************ 00:06:49.810 END TEST env_dpdk_post_init 00:06:49.810 ************************************ 00:06:49.810 13:15:30 env -- env/env.sh@26 -- # uname 00:06:49.810 13:15:30 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:06:49.810 13:15:30 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:49.810 13:15:30 env -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:49.810 13:15:30 env -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:49.810 13:15:30 env -- common/autotest_common.sh@10 -- # set +x 00:06:49.810 ************************************ 00:06:49.810 START TEST env_mem_callbacks 00:06:49.810 ************************************ 00:06:49.810 13:15:30 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:06:49.810 EAL: Detected CPU lcores: 128 00:06:49.810 EAL: Detected NUMA nodes: 2 00:06:49.810 EAL: Detected shared linkage of DPDK 00:06:49.810 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:06:49.810 
EAL: Selected IOVA mode 'PA' 00:06:49.810 EAL: VFIO support initialized 00:06:49.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.0 (socket 1) 00:06:49.810 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_asym 00:06:49.810 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.810 CRYPTODEV: Creating cryptodev 0000:cc:01.0_qat_sym 00:06:49.810 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.1 (socket 1) 00:06:49.810 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_asym 00:06:49.810 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.810 CRYPTODEV: Creating cryptodev 0000:cc:01.1_qat_sym 00:06:49.810 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.810 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.2 (socket 1) 00:06:49.810 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_asym 00:06:49.810 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.2_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.3 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.3_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.4 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_asym 00:06:49.811 CRYPTODEV: 
Initialisation parameters - name: 0000:cc:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.4_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.5 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.5_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.6 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.6_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:01.7 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:01.7_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.0 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.0_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.0_qat_sym,socket id: 1, 
max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.1 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.1_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.2 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.2_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.3 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.3_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.4 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.4_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.5 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 
0000:cc:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.5_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.6 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.6_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:cc:02.7 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:cc:02.7_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:cc:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.0 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.0_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.1 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.1_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.2 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.2_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.3 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.3_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.4 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.4_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.5 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.5_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.6 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_asym,socket id: 1, max queue pairs: 
0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.6_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:01.7 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:01.7_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.0 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:02.0_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.1 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:02.1_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.2 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:02.2_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:ce:02.3 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:02.3_qat_sym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.811 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.4 (socket 1) 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_asym 00:06:49.811 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.811 CRYPTODEV: Creating cryptodev 0000:ce:02.4_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.5 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:ce:02.5_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.6 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:ce:02.6_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:ce:02.7 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:ce:02.7_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 
0000:ce:02.7_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:ce:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.0 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.0_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.1 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.1_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.2 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.2_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.3 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.3_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.4 (socket 1) 00:06:49.812 CRYPTODEV: 
Creating cryptodev 0000:d0:01.4_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.4_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.5 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.5_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.6 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.6_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:01.7 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:01.7_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:01.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.0 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.0_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.0_qat_sym 00:06:49.812 CRYPTODEV: 
Initialisation parameters - name: 0000:d0:02.0_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.1 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.1_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.1_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.2 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.2_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.2_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.3 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.3_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.3_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.4 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.4_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.4_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.5 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_asym 
00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.5_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.5_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.6 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.6_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.6_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:d0:02.7 (socket 1) 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_asym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_asym,socket id: 1, max queue pairs: 0 00:06:49.812 CRYPTODEV: Creating cryptodev 0000:d0:02.7_qat_sym 00:06:49.812 CRYPTODEV: Initialisation parameters - name: 0000:d0:02.7_qat_sym,socket id: 1, max queue pairs: 0 00:06:49.812 TELEMETRY: No legacy callbacks, legacy socket not created 00:06:49.812 00:06:49.812 00:06:49.812 CUnit - A unit testing framework for C - Version 2.1-3 00:06:49.812 http://cunit.sourceforge.net/ 00:06:49.812 00:06:49.812 00:06:49.812 Suite: memory 00:06:49.812 Test: test ... 
00:06:49.812 register 0x200000200000 2097152 00:06:49.812 register 0x201000a00000 2097152 00:06:49.812 malloc 3145728 00:06:49.812 register 0x200000400000 4194304 00:06:49.812 buf 0x200000500000 len 3145728 PASSED 00:06:49.812 malloc 64 00:06:49.812 buf 0x2000004fff40 len 64 PASSED 00:06:49.812 malloc 4194304 00:06:49.812 register 0x200000800000 6291456 00:06:49.812 buf 0x200000a00000 len 4194304 PASSED 00:06:49.812 free 0x200000500000 3145728 00:06:49.812 free 0x2000004fff40 64 00:06:49.812 unregister 0x200000400000 4194304 PASSED 00:06:49.812 free 0x200000a00000 4194304 00:06:49.812 unregister 0x200000800000 6291456 PASSED 00:06:49.812 malloc 8388608 00:06:49.812 register 0x200000400000 10485760 00:06:49.812 buf 0x200000600000 len 8388608 PASSED 00:06:49.812 free 0x200000600000 8388608 00:06:49.812 unregister 0x200000400000 10485760 PASSED 00:06:49.812 passed 00:06:49.812 00:06:49.812 Run Summary: Type Total Ran Passed Failed Inactive 00:06:49.812 suites 1 1 n/a 0 0 00:06:49.812 tests 1 1 1 0 0 00:06:49.812 asserts 16 16 16 0 n/a 00:06:49.812 00:06:49.812 Elapsed time = 0.009 seconds 00:06:49.812 00:06:49.812 real 0m0.088s 00:06:49.812 user 0m0.033s 00:06:49.812 sys 0m0.055s 00:06:49.812 13:15:30 env.env_mem_callbacks -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:49.812 13:15:30 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:06:49.812 ************************************ 00:06:49.812 END TEST env_mem_callbacks 00:06:49.812 ************************************ 00:06:49.812 00:06:49.812 real 0m12.087s 00:06:49.812 user 0m5.112s 00:06:49.812 sys 0m0.998s 00:06:49.813 13:15:30 env -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:49.813 13:15:30 env -- common/autotest_common.sh@10 -- # set +x 00:06:49.813 ************************************ 00:06:49.813 END TEST env 00:06:49.813 ************************************ 00:06:49.813 13:15:30 -- spdk/autotest.sh@169 -- # run_test rpc 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:49.813 13:15:30 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:49.813 13:15:30 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:49.813 13:15:30 -- common/autotest_common.sh@10 -- # set +x 00:06:50.074 ************************************ 00:06:50.074 START TEST rpc 00:06:50.074 ************************************ 00:06:50.074 13:15:30 rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:06:50.074 * Looking for test storage... 00:06:50.074 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:50.074 13:15:30 rpc -- rpc/rpc.sh@65 -- # spdk_pid=822234 00:06:50.074 13:15:30 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:50.074 13:15:30 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:06:50.074 13:15:30 rpc -- rpc/rpc.sh@67 -- # waitforlisten 822234 00:06:50.074 13:15:30 rpc -- common/autotest_common.sh@831 -- # '[' -z 822234 ']' 00:06:50.074 13:15:30 rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:50.074 13:15:30 rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:50.074 13:15:30 rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:50.074 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:50.074 13:15:30 rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:50.074 13:15:30 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:50.074 [2024-07-25 13:15:30.847376] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:06:50.074 [2024-07-25 13:15:30.847509] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid822234 ] 00:06:50.333 [2024-07-25 13:15:30.992742] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:50.333 [2024-07-25 13:15:31.070615] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:06:50.333 [2024-07-25 13:15:31.070656] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 822234' to capture a snapshot of events at runtime. 00:06:50.333 [2024-07-25 13:15:31.070663] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:06:50.333 [2024-07-25 13:15:31.070669] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:06:50.333 [2024-07-25 13:15:31.070675] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid822234 for offline analysis/debug. 
00:06:50.333 [2024-07-25 13:15:31.070694] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.269 13:15:31 rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:51.269 13:15:31 rpc -- common/autotest_common.sh@864 -- # return 0 00:06:51.269 13:15:31 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:51.269 13:15:31 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:51.269 13:15:31 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:06:51.269 13:15:31 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:06:51.269 13:15:31 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:51.269 13:15:31 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:51.269 13:15:31 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:51.269 ************************************ 00:06:51.269 START TEST rpc_integrity 00:06:51.269 ************************************ 00:06:51.269 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:51.269 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:51.269 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.269 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:51.269 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.269 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # 
bdevs='[]' 00:06:51.269 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:51.269 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:51.269 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:51.269 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.269 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:51.270 { 00:06:51.270 "name": "Malloc0", 00:06:51.270 "aliases": [ 00:06:51.270 "80d085fe-8759-43d7-9d8e-eafdf2ad50fc" 00:06:51.270 ], 00:06:51.270 "product_name": "Malloc disk", 00:06:51.270 "block_size": 512, 00:06:51.270 "num_blocks": 16384, 00:06:51.270 "uuid": "80d085fe-8759-43d7-9d8e-eafdf2ad50fc", 00:06:51.270 "assigned_rate_limits": { 00:06:51.270 "rw_ios_per_sec": 0, 00:06:51.270 "rw_mbytes_per_sec": 0, 00:06:51.270 "r_mbytes_per_sec": 0, 00:06:51.270 "w_mbytes_per_sec": 0 00:06:51.270 }, 00:06:51.270 "claimed": false, 00:06:51.270 "zoned": false, 00:06:51.270 "supported_io_types": { 00:06:51.270 "read": true, 00:06:51.270 "write": true, 00:06:51.270 "unmap": true, 00:06:51.270 "flush": true, 00:06:51.270 "reset": true, 00:06:51.270 "nvme_admin": false, 00:06:51.270 "nvme_io": false, 00:06:51.270 "nvme_io_md": false, 00:06:51.270 "write_zeroes": true, 00:06:51.270 "zcopy": true, 00:06:51.270 "get_zone_info": false, 00:06:51.270 "zone_management": 
false, 00:06:51.270 "zone_append": false, 00:06:51.270 "compare": false, 00:06:51.270 "compare_and_write": false, 00:06:51.270 "abort": true, 00:06:51.270 "seek_hole": false, 00:06:51.270 "seek_data": false, 00:06:51.270 "copy": true, 00:06:51.270 "nvme_iov_md": false 00:06:51.270 }, 00:06:51.270 "memory_domains": [ 00:06:51.270 { 00:06:51.270 "dma_device_id": "system", 00:06:51.270 "dma_device_type": 1 00:06:51.270 }, 00:06:51.270 { 00:06:51.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:51.270 "dma_device_type": 2 00:06:51.270 } 00:06:51.270 ], 00:06:51.270 "driver_specific": {} 00:06:51.270 } 00:06:51.270 ]' 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:51.270 [2024-07-25 13:15:31.868527] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:06:51.270 [2024-07-25 13:15:31.868562] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:51.270 [2024-07-25 13:15:31.868573] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2575d50 00:06:51.270 [2024-07-25 13:15:31.868580] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:51.270 [2024-07-25 13:15:31.869851] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:51.270 [2024-07-25 13:15:31.869871] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:51.270 Passthru0 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 
00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:51.270 { 00:06:51.270 "name": "Malloc0", 00:06:51.270 "aliases": [ 00:06:51.270 "80d085fe-8759-43d7-9d8e-eafdf2ad50fc" 00:06:51.270 ], 00:06:51.270 "product_name": "Malloc disk", 00:06:51.270 "block_size": 512, 00:06:51.270 "num_blocks": 16384, 00:06:51.270 "uuid": "80d085fe-8759-43d7-9d8e-eafdf2ad50fc", 00:06:51.270 "assigned_rate_limits": { 00:06:51.270 "rw_ios_per_sec": 0, 00:06:51.270 "rw_mbytes_per_sec": 0, 00:06:51.270 "r_mbytes_per_sec": 0, 00:06:51.270 "w_mbytes_per_sec": 0 00:06:51.270 }, 00:06:51.270 "claimed": true, 00:06:51.270 "claim_type": "exclusive_write", 00:06:51.270 "zoned": false, 00:06:51.270 "supported_io_types": { 00:06:51.270 "read": true, 00:06:51.270 "write": true, 00:06:51.270 "unmap": true, 00:06:51.270 "flush": true, 00:06:51.270 "reset": true, 00:06:51.270 "nvme_admin": false, 00:06:51.270 "nvme_io": false, 00:06:51.270 "nvme_io_md": false, 00:06:51.270 "write_zeroes": true, 00:06:51.270 "zcopy": true, 00:06:51.270 "get_zone_info": false, 00:06:51.270 "zone_management": false, 00:06:51.270 "zone_append": false, 00:06:51.270 "compare": false, 00:06:51.270 "compare_and_write": false, 00:06:51.270 "abort": true, 00:06:51.270 "seek_hole": false, 00:06:51.270 "seek_data": false, 00:06:51.270 "copy": true, 00:06:51.270 "nvme_iov_md": false 00:06:51.270 }, 00:06:51.270 "memory_domains": [ 00:06:51.270 { 00:06:51.270 "dma_device_id": "system", 00:06:51.270 "dma_device_type": 1 00:06:51.270 }, 00:06:51.270 { 00:06:51.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:51.270 "dma_device_type": 2 00:06:51.270 } 00:06:51.270 ], 00:06:51.270 "driver_specific": {} 00:06:51.270 }, 00:06:51.270 { 00:06:51.270 
"name": "Passthru0", 00:06:51.270 "aliases": [ 00:06:51.270 "7438a78c-555b-51e4-8153-a2f5cd6b4d92" 00:06:51.270 ], 00:06:51.270 "product_name": "passthru", 00:06:51.270 "block_size": 512, 00:06:51.270 "num_blocks": 16384, 00:06:51.270 "uuid": "7438a78c-555b-51e4-8153-a2f5cd6b4d92", 00:06:51.270 "assigned_rate_limits": { 00:06:51.270 "rw_ios_per_sec": 0, 00:06:51.270 "rw_mbytes_per_sec": 0, 00:06:51.270 "r_mbytes_per_sec": 0, 00:06:51.270 "w_mbytes_per_sec": 0 00:06:51.270 }, 00:06:51.270 "claimed": false, 00:06:51.270 "zoned": false, 00:06:51.270 "supported_io_types": { 00:06:51.270 "read": true, 00:06:51.270 "write": true, 00:06:51.270 "unmap": true, 00:06:51.270 "flush": true, 00:06:51.270 "reset": true, 00:06:51.270 "nvme_admin": false, 00:06:51.270 "nvme_io": false, 00:06:51.270 "nvme_io_md": false, 00:06:51.270 "write_zeroes": true, 00:06:51.270 "zcopy": true, 00:06:51.270 "get_zone_info": false, 00:06:51.270 "zone_management": false, 00:06:51.270 "zone_append": false, 00:06:51.270 "compare": false, 00:06:51.270 "compare_and_write": false, 00:06:51.270 "abort": true, 00:06:51.270 "seek_hole": false, 00:06:51.270 "seek_data": false, 00:06:51.270 "copy": true, 00:06:51.270 "nvme_iov_md": false 00:06:51.270 }, 00:06:51.270 "memory_domains": [ 00:06:51.270 { 00:06:51.270 "dma_device_id": "system", 00:06:51.270 "dma_device_type": 1 00:06:51.270 }, 00:06:51.270 { 00:06:51.270 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:51.270 "dma_device_type": 2 00:06:51.270 } 00:06:51.270 ], 00:06:51.270 "driver_specific": { 00:06:51.270 "passthru": { 00:06:51.270 "name": "Passthru0", 00:06:51.270 "base_bdev_name": "Malloc0" 00:06:51.270 } 00:06:51.270 } 00:06:51.270 } 00:06:51.270 ]' 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:51.270 13:15:31 rpc.rpc_integrity -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:51.270 13:15:31 rpc.rpc_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:51.270 13:15:31 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:51.270 13:15:32 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:51.270 00:06:51.270 real 0m0.315s 00:06:51.270 user 0m0.214s 00:06:51.270 sys 0m0.034s 00:06:51.270 13:15:32 rpc.rpc_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:51.270 13:15:32 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:51.270 ************************************ 00:06:51.270 END TEST rpc_integrity 00:06:51.270 ************************************ 00:06:51.530 13:15:32 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:06:51.530 13:15:32 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:51.530 13:15:32 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:51.530 13:15:32 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:51.530 ************************************ 00:06:51.530 START TEST rpc_plugins 00:06:51.530 
************************************ 00:06:51.530 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # rpc_plugins 00:06:51.530 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:06:51.530 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.530 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:51.530 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.530 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:06:51.530 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:06:51.530 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.530 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:51.530 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.530 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:06:51.530 { 00:06:51.530 "name": "Malloc1", 00:06:51.530 "aliases": [ 00:06:51.530 "fb5ae9b6-3994-4172-92e9-274c01554e64" 00:06:51.530 ], 00:06:51.530 "product_name": "Malloc disk", 00:06:51.530 "block_size": 4096, 00:06:51.530 "num_blocks": 256, 00:06:51.530 "uuid": "fb5ae9b6-3994-4172-92e9-274c01554e64", 00:06:51.530 "assigned_rate_limits": { 00:06:51.530 "rw_ios_per_sec": 0, 00:06:51.530 "rw_mbytes_per_sec": 0, 00:06:51.530 "r_mbytes_per_sec": 0, 00:06:51.530 "w_mbytes_per_sec": 0 00:06:51.530 }, 00:06:51.530 "claimed": false, 00:06:51.530 "zoned": false, 00:06:51.530 "supported_io_types": { 00:06:51.530 "read": true, 00:06:51.530 "write": true, 00:06:51.530 "unmap": true, 00:06:51.530 "flush": true, 00:06:51.530 "reset": true, 00:06:51.530 "nvme_admin": false, 00:06:51.530 "nvme_io": false, 00:06:51.530 "nvme_io_md": false, 00:06:51.530 "write_zeroes": true, 00:06:51.530 "zcopy": true, 00:06:51.530 "get_zone_info": false, 00:06:51.530 "zone_management": false, 00:06:51.530 "zone_append": false, 
00:06:51.530 "compare": false, 00:06:51.530 "compare_and_write": false, 00:06:51.530 "abort": true, 00:06:51.530 "seek_hole": false, 00:06:51.530 "seek_data": false, 00:06:51.530 "copy": true, 00:06:51.530 "nvme_iov_md": false 00:06:51.530 }, 00:06:51.530 "memory_domains": [ 00:06:51.530 { 00:06:51.530 "dma_device_id": "system", 00:06:51.530 "dma_device_type": 1 00:06:51.530 }, 00:06:51.530 { 00:06:51.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:51.530 "dma_device_type": 2 00:06:51.530 } 00:06:51.530 ], 00:06:51.530 "driver_specific": {} 00:06:51.530 } 00:06:51.530 ]' 00:06:51.530 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:06:51.530 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:06:51.530 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:06:51.530 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.530 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:51.530 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.531 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:06:51.531 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.531 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:51.531 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.531 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:06:51.531 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:06:51.531 13:15:32 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:06:51.531 00:06:51.531 real 0m0.152s 00:06:51.531 user 0m0.091s 00:06:51.531 sys 0m0.024s 00:06:51.531 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:51.531 13:15:32 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:06:51.531 ************************************ 00:06:51.531 END TEST 
rpc_plugins 00:06:51.531 ************************************ 00:06:51.531 13:15:32 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:06:51.531 13:15:32 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:51.531 13:15:32 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:51.531 13:15:32 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:51.790 ************************************ 00:06:51.790 START TEST rpc_trace_cmd_test 00:06:51.790 ************************************ 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # rpc_trace_cmd_test 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:06:51.790 "tpoint_shm_path": "/dev/shm/spdk_tgt_trace.pid822234", 00:06:51.790 "tpoint_group_mask": "0x8", 00:06:51.790 "iscsi_conn": { 00:06:51.790 "mask": "0x2", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "scsi": { 00:06:51.790 "mask": "0x4", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "bdev": { 00:06:51.790 "mask": "0x8", 00:06:51.790 "tpoint_mask": "0xffffffffffffffff" 00:06:51.790 }, 00:06:51.790 "nvmf_rdma": { 00:06:51.790 "mask": "0x10", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "nvmf_tcp": { 00:06:51.790 "mask": "0x20", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "ftl": { 00:06:51.790 "mask": "0x40", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "blobfs": { 00:06:51.790 "mask": "0x80", 00:06:51.790 "tpoint_mask": "0x0" 
00:06:51.790 }, 00:06:51.790 "dsa": { 00:06:51.790 "mask": "0x200", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "thread": { 00:06:51.790 "mask": "0x400", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "nvme_pcie": { 00:06:51.790 "mask": "0x800", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "iaa": { 00:06:51.790 "mask": "0x1000", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "nvme_tcp": { 00:06:51.790 "mask": "0x2000", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "bdev_nvme": { 00:06:51.790 "mask": "0x4000", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 }, 00:06:51.790 "sock": { 00:06:51.790 "mask": "0x8000", 00:06:51.790 "tpoint_mask": "0x0" 00:06:51.790 } 00:06:51.790 }' 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 'has("tpoint_shm_path")' 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:06:51.790 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:06:52.050 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:06:52.050 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:06:52.050 13:15:32 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:06:52.050 00:06:52.050 real 0m0.328s 00:06:52.050 user 0m0.290s 00:06:52.050 sys 0m0.030s 00:06:52.050 13:15:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:52.050 13:15:32 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:06:52.050 ************************************ 
00:06:52.050 END TEST rpc_trace_cmd_test 00:06:52.050 ************************************ 00:06:52.050 13:15:32 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:06:52.050 13:15:32 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:06:52.050 13:15:32 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:06:52.050 13:15:32 rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:52.050 13:15:32 rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:52.050 13:15:32 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:52.050 ************************************ 00:06:52.050 START TEST rpc_daemon_integrity 00:06:52.050 ************************************ 00:06:52.050 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # rpc_integrity 00:06:52.050 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:06:52.050 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.050 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:52.051 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.051 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:06:52.051 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:06:52.051 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:06:52.051 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:06:52.051 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.051 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:06:52.311 13:15:32 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:06:52.311 { 00:06:52.311 "name": "Malloc2", 00:06:52.311 "aliases": [ 00:06:52.311 "e6f5fa77-45f2-4b6b-b0c4-8e36d7bd87d2" 00:06:52.311 ], 00:06:52.311 "product_name": "Malloc disk", 00:06:52.311 "block_size": 512, 00:06:52.311 "num_blocks": 16384, 00:06:52.311 "uuid": "e6f5fa77-45f2-4b6b-b0c4-8e36d7bd87d2", 00:06:52.311 "assigned_rate_limits": { 00:06:52.311 "rw_ios_per_sec": 0, 00:06:52.311 "rw_mbytes_per_sec": 0, 00:06:52.311 "r_mbytes_per_sec": 0, 00:06:52.311 "w_mbytes_per_sec": 0 00:06:52.311 }, 00:06:52.311 "claimed": false, 00:06:52.311 "zoned": false, 00:06:52.311 "supported_io_types": { 00:06:52.311 "read": true, 00:06:52.311 "write": true, 00:06:52.311 "unmap": true, 00:06:52.311 "flush": true, 00:06:52.311 "reset": true, 00:06:52.311 "nvme_admin": false, 00:06:52.311 "nvme_io": false, 00:06:52.311 "nvme_io_md": false, 00:06:52.311 "write_zeroes": true, 00:06:52.311 "zcopy": true, 00:06:52.311 "get_zone_info": false, 00:06:52.311 "zone_management": false, 00:06:52.311 "zone_append": false, 00:06:52.311 "compare": false, 00:06:52.311 "compare_and_write": false, 00:06:52.311 "abort": true, 00:06:52.311 "seek_hole": false, 00:06:52.311 "seek_data": false, 00:06:52.311 "copy": true, 00:06:52.311 "nvme_iov_md": false 00:06:52.311 }, 00:06:52.311 "memory_domains": [ 00:06:52.311 { 00:06:52.311 "dma_device_id": "system", 00:06:52.311 "dma_device_type": 1 00:06:52.311 }, 00:06:52.311 { 00:06:52.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:52.311 "dma_device_type": 2 00:06:52.311 } 00:06:52.311 ], 00:06:52.311 "driver_specific": {} 00:06:52.311 } 00:06:52.311 ]' 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- 
rpc/rpc.sh@17 -- # jq length 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:52.311 [2024-07-25 13:15:32.911319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:06:52.311 [2024-07-25 13:15:32.911346] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:06:52.311 [2024-07-25 13:15:32.911360] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2719880 00:06:52.311 [2024-07-25 13:15:32.911367] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:06:52.311 [2024-07-25 13:15:32.912513] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:06:52.311 [2024-07-25 13:15:32.912532] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:06:52.311 Passthru0 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.311 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:06:52.312 { 00:06:52.312 "name": "Malloc2", 00:06:52.312 "aliases": [ 00:06:52.312 "e6f5fa77-45f2-4b6b-b0c4-8e36d7bd87d2" 00:06:52.312 ], 00:06:52.312 "product_name": "Malloc disk", 00:06:52.312 "block_size": 512, 00:06:52.312 "num_blocks": 16384, 00:06:52.312 
"uuid": "e6f5fa77-45f2-4b6b-b0c4-8e36d7bd87d2", 00:06:52.312 "assigned_rate_limits": { 00:06:52.312 "rw_ios_per_sec": 0, 00:06:52.312 "rw_mbytes_per_sec": 0, 00:06:52.312 "r_mbytes_per_sec": 0, 00:06:52.312 "w_mbytes_per_sec": 0 00:06:52.312 }, 00:06:52.312 "claimed": true, 00:06:52.312 "claim_type": "exclusive_write", 00:06:52.312 "zoned": false, 00:06:52.312 "supported_io_types": { 00:06:52.312 "read": true, 00:06:52.312 "write": true, 00:06:52.312 "unmap": true, 00:06:52.312 "flush": true, 00:06:52.312 "reset": true, 00:06:52.312 "nvme_admin": false, 00:06:52.312 "nvme_io": false, 00:06:52.312 "nvme_io_md": false, 00:06:52.312 "write_zeroes": true, 00:06:52.312 "zcopy": true, 00:06:52.312 "get_zone_info": false, 00:06:52.312 "zone_management": false, 00:06:52.312 "zone_append": false, 00:06:52.312 "compare": false, 00:06:52.312 "compare_and_write": false, 00:06:52.312 "abort": true, 00:06:52.312 "seek_hole": false, 00:06:52.312 "seek_data": false, 00:06:52.312 "copy": true, 00:06:52.312 "nvme_iov_md": false 00:06:52.312 }, 00:06:52.312 "memory_domains": [ 00:06:52.312 { 00:06:52.312 "dma_device_id": "system", 00:06:52.312 "dma_device_type": 1 00:06:52.312 }, 00:06:52.312 { 00:06:52.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:52.312 "dma_device_type": 2 00:06:52.312 } 00:06:52.312 ], 00:06:52.312 "driver_specific": {} 00:06:52.312 }, 00:06:52.312 { 00:06:52.312 "name": "Passthru0", 00:06:52.312 "aliases": [ 00:06:52.312 "97ff94bd-a8c0-59d5-ab20-78c11dae0946" 00:06:52.312 ], 00:06:52.312 "product_name": "passthru", 00:06:52.312 "block_size": 512, 00:06:52.312 "num_blocks": 16384, 00:06:52.312 "uuid": "97ff94bd-a8c0-59d5-ab20-78c11dae0946", 00:06:52.312 "assigned_rate_limits": { 00:06:52.312 "rw_ios_per_sec": 0, 00:06:52.312 "rw_mbytes_per_sec": 0, 00:06:52.312 "r_mbytes_per_sec": 0, 00:06:52.312 "w_mbytes_per_sec": 0 00:06:52.312 }, 00:06:52.312 "claimed": false, 00:06:52.312 "zoned": false, 00:06:52.312 "supported_io_types": { 00:06:52.312 "read": true, 
00:06:52.312 "write": true, 00:06:52.312 "unmap": true, 00:06:52.312 "flush": true, 00:06:52.312 "reset": true, 00:06:52.312 "nvme_admin": false, 00:06:52.312 "nvme_io": false, 00:06:52.312 "nvme_io_md": false, 00:06:52.312 "write_zeroes": true, 00:06:52.312 "zcopy": true, 00:06:52.312 "get_zone_info": false, 00:06:52.312 "zone_management": false, 00:06:52.312 "zone_append": false, 00:06:52.312 "compare": false, 00:06:52.312 "compare_and_write": false, 00:06:52.312 "abort": true, 00:06:52.312 "seek_hole": false, 00:06:52.312 "seek_data": false, 00:06:52.312 "copy": true, 00:06:52.312 "nvme_iov_md": false 00:06:52.312 }, 00:06:52.312 "memory_domains": [ 00:06:52.312 { 00:06:52.312 "dma_device_id": "system", 00:06:52.312 "dma_device_type": 1 00:06:52.312 }, 00:06:52.312 { 00:06:52.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:06:52.312 "dma_device_type": 2 00:06:52.312 } 00:06:52.312 ], 00:06:52.312 "driver_specific": { 00:06:52.312 "passthru": { 00:06:52.312 "name": "Passthru0", 00:06:52.312 "base_bdev_name": "Malloc2" 00:06:52.312 } 00:06:52.312 } 00:06:52.312 } 00:06:52.312 ]' 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity 
-- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:52.312 13:15:32 rpc.rpc_daemon_integrity -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:52.312 13:15:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:06:52.312 13:15:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:06:52.312 13:15:33 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:06:52.312 00:06:52.312 real 0m0.296s 00:06:52.312 user 0m0.203s 00:06:52.312 sys 0m0.040s 00:06:52.312 13:15:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:52.312 13:15:33 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:06:52.312 ************************************ 00:06:52.312 END TEST rpc_daemon_integrity 00:06:52.312 ************************************ 00:06:52.312 13:15:33 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:06:52.312 13:15:33 rpc -- rpc/rpc.sh@84 -- # killprocess 822234 00:06:52.312 13:15:33 rpc -- common/autotest_common.sh@950 -- # '[' -z 822234 ']' 00:06:52.312 13:15:33 rpc -- common/autotest_common.sh@954 -- # kill -0 822234 00:06:52.312 13:15:33 rpc -- common/autotest_common.sh@955 -- # uname 00:06:52.312 13:15:33 rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:52.312 13:15:33 rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 822234 00:06:52.572 13:15:33 rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:52.572 13:15:33 rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:52.572 13:15:33 rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 822234' 00:06:52.572 killing process with pid 822234 00:06:52.572 
13:15:33 rpc -- common/autotest_common.sh@969 -- # kill 822234 00:06:52.572 13:15:33 rpc -- common/autotest_common.sh@974 -- # wait 822234 00:06:52.572 00:06:52.572 real 0m2.687s 00:06:52.572 user 0m3.644s 00:06:52.572 sys 0m0.744s 00:06:52.572 13:15:33 rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:52.572 13:15:33 rpc -- common/autotest_common.sh@10 -- # set +x 00:06:52.572 ************************************ 00:06:52.572 END TEST rpc 00:06:52.572 ************************************ 00:06:52.572 13:15:33 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:52.572 13:15:33 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:52.572 13:15:33 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:52.572 13:15:33 -- common/autotest_common.sh@10 -- # set +x 00:06:52.832 ************************************ 00:06:52.832 START TEST skip_rpc 00:06:52.832 ************************************ 00:06:52.832 13:15:33 skip_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:06:52.832 * Looking for test storage... 
00:06:52.832 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:06:52.832 13:15:33 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:52.832 13:15:33 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:06:52.832 13:15:33 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:06:52.832 13:15:33 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:52.832 13:15:33 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:52.832 13:15:33 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:52.832 ************************************ 00:06:52.832 START TEST skip_rpc 00:06:52.832 ************************************ 00:06:52.832 13:15:33 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # test_skip_rpc 00:06:52.832 13:15:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=822744 00:06:52.832 13:15:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:52.832 13:15:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:06:52.832 13:15:33 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:06:52.832 [2024-07-25 13:15:33.607936] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:06:52.832 [2024-07-25 13:15:33.607997] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid822744 ] 00:06:53.091 [2024-07-25 13:15:33.701418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:53.091 [2024-07-25 13:15:33.776574] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.372 13:15:38 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:06:58.372 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@650 -- # local es=0 00:06:58.372 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # valid_exec_arg rpc_cmd spdk_get_version 00:06:58.372 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@638 -- # local arg=rpc_cmd 00:06:58.372 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:58.372 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # type -t rpc_cmd 00:06:58.372 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # rpc_cmd spdk_get_version 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@653 -- # es=1 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- 
rpc/skip_rpc.sh@22 -- # trap - SIGINT SIGTERM EXIT 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 822744 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@950 -- # '[' -z 822744 ']' 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # kill -0 822744 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # uname 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 822744 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 822744' 00:06:58.373 killing process with pid 822744 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@969 -- # kill 822744 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@974 -- # wait 822744 00:06:58.373 00:06:58.373 real 0m5.269s 00:06:58.373 user 0m5.042s 00:06:58.373 sys 0m0.247s 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:06:58.373 13:15:38 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:58.373 ************************************ 00:06:58.373 END TEST skip_rpc 00:06:58.373 ************************************ 00:06:58.373 13:15:38 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:06:58.373 13:15:38 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:06:58.373 13:15:38 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:06:58.373 13:15:38 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:58.373 ************************************ 00:06:58.373 START TEST 
skip_rpc_with_json 00:06:58.373 ************************************ 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_json 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=823684 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 823684 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@831 -- # '[' -z 823684 ']' 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@836 -- # local max_retries=100 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:58.373 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@840 -- # xtrace_disable 00:06:58.373 13:15:38 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:58.373 [2024-07-25 13:15:38.935360] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:06:58.373 [2024-07-25 13:15:38.935404] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid823684 ] 00:06:58.373 [2024-07-25 13:15:39.022021] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.373 [2024-07-25 13:15:39.085468] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@864 -- # return 0 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:59.313 [2024-07-25 13:15:39.788704] nvmf_rpc.c:2569:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:06:59.313 request: 00:06:59.313 { 00:06:59.313 "trtype": "tcp", 00:06:59.313 "method": "nvmf_get_transports", 00:06:59.313 "req_id": 1 00:06:59.313 } 00:06:59.313 Got JSON-RPC error response 00:06:59.313 response: 00:06:59.313 { 00:06:59.313 "code": -19, 00:06:59.313 "message": "No such device" 00:06:59.313 } 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 1 == 0 ]] 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:59.313 [2024-07-25 13:15:39.800825] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:06:59.313 13:15:39 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:06:59.313 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@561 -- # xtrace_disable 00:06:59.314 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:06:59.314 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:06:59.314 13:15:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:06:59.314 { 00:06:59.314 "subsystems": [ 00:06:59.314 { 00:06:59.314 "subsystem": "keyring", 00:06:59.314 "config": [] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "iobuf", 00:06:59.314 "config": [ 00:06:59.314 { 00:06:59.314 "method": "iobuf_set_options", 00:06:59.314 "params": { 00:06:59.314 "small_pool_count": 8192, 00:06:59.314 "large_pool_count": 1024, 00:06:59.314 "small_bufsize": 8192, 00:06:59.314 "large_bufsize": 135168 00:06:59.314 } 00:06:59.314 } 00:06:59.314 ] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "sock", 00:06:59.314 "config": [ 00:06:59.314 { 00:06:59.314 "method": "sock_set_default_impl", 00:06:59.314 "params": { 00:06:59.314 "impl_name": "posix" 00:06:59.314 } 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "method": "sock_impl_set_options", 00:06:59.314 "params": { 00:06:59.314 "impl_name": "ssl", 00:06:59.314 "recv_buf_size": 4096, 00:06:59.314 "send_buf_size": 4096, 00:06:59.314 "enable_recv_pipe": true, 00:06:59.314 "enable_quickack": false, 00:06:59.314 "enable_placement_id": 0, 00:06:59.314 "enable_zerocopy_send_server": true, 00:06:59.314 "enable_zerocopy_send_client": false, 00:06:59.314 "zerocopy_threshold": 0, 00:06:59.314 "tls_version": 0, 00:06:59.314 "enable_ktls": false 00:06:59.314 } 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "method": "sock_impl_set_options", 00:06:59.314 "params": { 
00:06:59.314 "impl_name": "posix", 00:06:59.314 "recv_buf_size": 2097152, 00:06:59.314 "send_buf_size": 2097152, 00:06:59.314 "enable_recv_pipe": true, 00:06:59.314 "enable_quickack": false, 00:06:59.314 "enable_placement_id": 0, 00:06:59.314 "enable_zerocopy_send_server": true, 00:06:59.314 "enable_zerocopy_send_client": false, 00:06:59.314 "zerocopy_threshold": 0, 00:06:59.314 "tls_version": 0, 00:06:59.314 "enable_ktls": false 00:06:59.314 } 00:06:59.314 } 00:06:59.314 ] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "vmd", 00:06:59.314 "config": [] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "accel", 00:06:59.314 "config": [ 00:06:59.314 { 00:06:59.314 "method": "accel_set_options", 00:06:59.314 "params": { 00:06:59.314 "small_cache_size": 128, 00:06:59.314 "large_cache_size": 16, 00:06:59.314 "task_count": 2048, 00:06:59.314 "sequence_count": 2048, 00:06:59.314 "buf_count": 2048 00:06:59.314 } 00:06:59.314 } 00:06:59.314 ] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "bdev", 00:06:59.314 "config": [ 00:06:59.314 { 00:06:59.314 "method": "bdev_set_options", 00:06:59.314 "params": { 00:06:59.314 "bdev_io_pool_size": 65535, 00:06:59.314 "bdev_io_cache_size": 256, 00:06:59.314 "bdev_auto_examine": true, 00:06:59.314 "iobuf_small_cache_size": 128, 00:06:59.314 "iobuf_large_cache_size": 16 00:06:59.314 } 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "method": "bdev_raid_set_options", 00:06:59.314 "params": { 00:06:59.314 "process_window_size_kb": 1024, 00:06:59.314 "process_max_bandwidth_mb_sec": 0 00:06:59.314 } 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "method": "bdev_iscsi_set_options", 00:06:59.314 "params": { 00:06:59.314 "timeout_sec": 30 00:06:59.314 } 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "method": "bdev_nvme_set_options", 00:06:59.314 "params": { 00:06:59.314 "action_on_timeout": "none", 00:06:59.314 "timeout_us": 0, 00:06:59.314 "timeout_admin_us": 0, 00:06:59.314 "keep_alive_timeout_ms": 10000, 00:06:59.314 
"arbitration_burst": 0, 00:06:59.314 "low_priority_weight": 0, 00:06:59.314 "medium_priority_weight": 0, 00:06:59.314 "high_priority_weight": 0, 00:06:59.314 "nvme_adminq_poll_period_us": 10000, 00:06:59.314 "nvme_ioq_poll_period_us": 0, 00:06:59.314 "io_queue_requests": 0, 00:06:59.314 "delay_cmd_submit": true, 00:06:59.314 "transport_retry_count": 4, 00:06:59.314 "bdev_retry_count": 3, 00:06:59.314 "transport_ack_timeout": 0, 00:06:59.314 "ctrlr_loss_timeout_sec": 0, 00:06:59.314 "reconnect_delay_sec": 0, 00:06:59.314 "fast_io_fail_timeout_sec": 0, 00:06:59.314 "disable_auto_failback": false, 00:06:59.314 "generate_uuids": false, 00:06:59.314 "transport_tos": 0, 00:06:59.314 "nvme_error_stat": false, 00:06:59.314 "rdma_srq_size": 0, 00:06:59.314 "io_path_stat": false, 00:06:59.314 "allow_accel_sequence": false, 00:06:59.314 "rdma_max_cq_size": 0, 00:06:59.314 "rdma_cm_event_timeout_ms": 0, 00:06:59.314 "dhchap_digests": [ 00:06:59.314 "sha256", 00:06:59.314 "sha384", 00:06:59.314 "sha512" 00:06:59.314 ], 00:06:59.314 "dhchap_dhgroups": [ 00:06:59.314 "null", 00:06:59.314 "ffdhe2048", 00:06:59.314 "ffdhe3072", 00:06:59.314 "ffdhe4096", 00:06:59.314 "ffdhe6144", 00:06:59.314 "ffdhe8192" 00:06:59.314 ] 00:06:59.314 } 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "method": "bdev_nvme_set_hotplug", 00:06:59.314 "params": { 00:06:59.314 "period_us": 100000, 00:06:59.314 "enable": false 00:06:59.314 } 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "method": "bdev_wait_for_examine" 00:06:59.314 } 00:06:59.314 ] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "scsi", 00:06:59.314 "config": null 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "scheduler", 00:06:59.314 "config": [ 00:06:59.314 { 00:06:59.314 "method": "framework_set_scheduler", 00:06:59.314 "params": { 00:06:59.314 "name": "static" 00:06:59.314 } 00:06:59.314 } 00:06:59.314 ] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "vhost_scsi", 00:06:59.314 "config": [] 00:06:59.314 }, 
00:06:59.314 { 00:06:59.314 "subsystem": "vhost_blk", 00:06:59.314 "config": [] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "ublk", 00:06:59.314 "config": [] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "nbd", 00:06:59.314 "config": [] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "nvmf", 00:06:59.314 "config": [ 00:06:59.314 { 00:06:59.314 "method": "nvmf_set_config", 00:06:59.314 "params": { 00:06:59.314 "discovery_filter": "match_any", 00:06:59.314 "admin_cmd_passthru": { 00:06:59.314 "identify_ctrlr": false 00:06:59.314 } 00:06:59.314 } 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "method": "nvmf_set_max_subsystems", 00:06:59.314 "params": { 00:06:59.314 "max_subsystems": 1024 00:06:59.314 } 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "method": "nvmf_set_crdt", 00:06:59.314 "params": { 00:06:59.314 "crdt1": 0, 00:06:59.314 "crdt2": 0, 00:06:59.314 "crdt3": 0 00:06:59.314 } 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "method": "nvmf_create_transport", 00:06:59.314 "params": { 00:06:59.314 "trtype": "TCP", 00:06:59.314 "max_queue_depth": 128, 00:06:59.314 "max_io_qpairs_per_ctrlr": 127, 00:06:59.314 "in_capsule_data_size": 4096, 00:06:59.314 "max_io_size": 131072, 00:06:59.314 "io_unit_size": 131072, 00:06:59.314 "max_aq_depth": 128, 00:06:59.314 "num_shared_buffers": 511, 00:06:59.314 "buf_cache_size": 4294967295, 00:06:59.314 "dif_insert_or_strip": false, 00:06:59.314 "zcopy": false, 00:06:59.314 "c2h_success": true, 00:06:59.314 "sock_priority": 0, 00:06:59.314 "abort_timeout_sec": 1, 00:06:59.314 "ack_timeout": 0, 00:06:59.314 "data_wr_pool_size": 0 00:06:59.314 } 00:06:59.314 } 00:06:59.314 ] 00:06:59.314 }, 00:06:59.314 { 00:06:59.314 "subsystem": "iscsi", 00:06:59.314 "config": [ 00:06:59.314 { 00:06:59.314 "method": "iscsi_set_options", 00:06:59.314 "params": { 00:06:59.314 "node_base": "iqn.2016-06.io.spdk", 00:06:59.314 "max_sessions": 128, 00:06:59.314 "max_connections_per_session": 2, 00:06:59.314 "max_queue_depth": 
64, 00:06:59.314 "default_time2wait": 2, 00:06:59.314 "default_time2retain": 20, 00:06:59.314 "first_burst_length": 8192, 00:06:59.314 "immediate_data": true, 00:06:59.314 "allow_duplicated_isid": false, 00:06:59.314 "error_recovery_level": 0, 00:06:59.314 "nop_timeout": 60, 00:06:59.314 "nop_in_interval": 30, 00:06:59.314 "disable_chap": false, 00:06:59.314 "require_chap": false, 00:06:59.314 "mutual_chap": false, 00:06:59.314 "chap_group": 0, 00:06:59.314 "max_large_datain_per_connection": 64, 00:06:59.314 "max_r2t_per_connection": 4, 00:06:59.314 "pdu_pool_size": 36864, 00:06:59.314 "immediate_data_pool_size": 16384, 00:06:59.314 "data_out_pool_size": 2048 00:06:59.314 } 00:06:59.314 } 00:06:59.314 ] 00:06:59.314 } 00:06:59.314 ] 00:06:59.315 } 00:06:59.315 13:15:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:06:59.315 13:15:39 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 823684 00:06:59.315 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 823684 ']' 00:06:59.315 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 823684 00:06:59.315 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:06:59.315 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:06:59.315 13:15:39 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 823684 00:06:59.315 13:15:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:06:59.315 13:15:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:06:59.315 13:15:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 823684' 00:06:59.315 killing process with pid 823684 00:06:59.315 13:15:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 823684 00:06:59.315 
13:15:40 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 823684 00:06:59.574 13:15:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=823977 00:06:59.574 13:15:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:06:59.574 13:15:40 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:07:04.930 13:15:45 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 823977 00:07:04.930 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@950 -- # '[' -z 823977 ']' 00:07:04.930 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # kill -0 823977 00:07:04.930 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # uname 00:07:04.930 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:04.930 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 823977 00:07:04.930 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:04.930 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # echo 'killing process with pid 823977' 00:07:04.931 killing process with pid 823977 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@969 -- # kill 823977 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@974 -- # wait 823977 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:07:04.931 00:07:04.931 real 0m6.611s 00:07:04.931 user 0m6.524s 00:07:04.931 sys 0m0.563s 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:07:04.931 ************************************ 00:07:04.931 END TEST skip_rpc_with_json 00:07:04.931 ************************************ 00:07:04.931 13:15:45 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:07:04.931 13:15:45 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:04.931 13:15:45 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:04.931 13:15:45 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:04.931 ************************************ 00:07:04.931 START TEST skip_rpc_with_delay 00:07:04.931 ************************************ 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # test_skip_rpc_with_delay 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@650 -- # local es=0 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:07:04.931 [2024-07-25 13:15:45.629009] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:07:04.931 [2024-07-25 13:15:45.629078] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@653 -- # es=1 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:04.931 00:07:04.931 real 0m0.077s 00:07:04.931 user 0m0.052s 00:07:04.931 sys 0m0.025s 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:04.931 13:15:45 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:07:04.931 ************************************ 00:07:04.931 END TEST skip_rpc_with_delay 00:07:04.931 ************************************ 00:07:04.931 13:15:45 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:07:04.931 13:15:45 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:07:04.931 13:15:45 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:07:04.931 13:15:45 skip_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:04.931 13:15:45 skip_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:04.931 13:15:45 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:05.190 ************************************ 00:07:05.190 START TEST exit_on_failed_rpc_init 00:07:05.190 ************************************ 00:07:05.190 13:15:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # test_exit_on_failed_rpc_init 00:07:05.190 13:15:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=824955 00:07:05.190 13:15:45 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 824955 00:07:05.190 13:15:45 
skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:07:05.190 13:15:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@831 -- # '[' -z 824955 ']' 00:07:05.190 13:15:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:05.190 13:15:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:05.190 13:15:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:05.190 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:05.190 13:15:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:05.190 13:15:45 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:07:05.190 [2024-07-25 13:15:45.788393] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:07:05.190 [2024-07-25 13:15:45.788449] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid824955 ] 00:07:05.190 [2024-07-25 13:15:45.879232] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.190 [2024-07-25 13:15:45.946614] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@864 -- # return 0 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@650 -- # local es=0 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:07:06.127 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:07:06.127 [2024-07-25 13:15:46.716445] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:07:06.127 [2024-07-25 13:15:46.716494] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid825186 ] 00:07:06.127 [2024-07-25 13:15:46.799535] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:06.127 [2024-07-25 13:15:46.878384] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:06.127 [2024-07-25 13:15:46.878457] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:07:06.127 [2024-07-25 13:15:46.878470] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:07:06.127 [2024-07-25 13:15:46.878480] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@653 -- # es=234 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # es=106 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@663 -- # case "$es" in 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@670 -- # es=1 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 824955 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@950 -- # '[' -z 824955 ']' 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # kill -0 824955 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # uname 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:06.387 13:15:46 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 824955 00:07:06.387 13:15:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:06.387 13:15:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:06.387 13:15:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # echo 'killing process with pid 824955' 
00:07:06.387 killing process with pid 824955
00:07:06.387 13:15:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@969 -- # kill 824955
00:07:06.387 13:15:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@974 -- # wait 824955
00:07:06.647
00:07:06.647 real 0m1.488s
00:07:06.647 user 0m1.831s
00:07:06.647 sys 0m0.399s
00:07:06.647 13:15:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1126 -- # xtrace_disable
00:07:06.647 13:15:47 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x
00:07:06.647 ************************************
00:07:06.647 END TEST exit_on_failed_rpc_init
00:07:06.647 ************************************
00:07:06.647 13:15:47 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json
00:07:06.647
00:07:06.647 real 0m13.861s
00:07:06.647 user 0m13.625s
00:07:06.647 sys 0m1.499s
00:07:06.647 13:15:47 skip_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable
00:07:06.647 13:15:47 skip_rpc -- common/autotest_common.sh@10 -- # set +x
00:07:06.647 ************************************
00:07:06.647 END TEST skip_rpc
00:07:06.647 ************************************
00:07:06.647 13:15:47 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh
00:07:06.647 13:15:47 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:07:06.647 13:15:47 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:07:06.647 13:15:47 -- common/autotest_common.sh@10 -- # set +x
00:07:06.647 ************************************
00:07:06.647 START TEST rpc_client
00:07:06.647 ************************************
00:07:06.647 13:15:47 rpc_client -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh
00:07:06.647 * Looking for test storage...
00:07:06.647 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client
00:07:06.647 13:15:47 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test
00:07:06.909 OK
00:07:06.909 13:15:47 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT
00:07:06.909
00:07:06.909 real 0m0.128s
00:07:06.909 user 0m0.058s
00:07:06.909 sys 0m0.078s
00:07:06.909 13:15:47 rpc_client -- common/autotest_common.sh@1126 -- # xtrace_disable
00:07:06.909 13:15:47 rpc_client -- common/autotest_common.sh@10 -- # set +x
00:07:06.909 ************************************
00:07:06.909 END TEST rpc_client
00:07:06.909 ************************************
00:07:06.909 13:15:47 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh
00:07:06.909 13:15:47 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:07:06.909 13:15:47 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:07:06.909 13:15:47 -- common/autotest_common.sh@10 -- # set +x
00:07:06.909 ************************************
00:07:06.909 START TEST json_config
00:07:06.909 ************************************
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@7 -- # uname -s
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]]
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS=
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID")
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect'
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:07:06.909 13:15:47 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]]
00:07:06.909 13:15:47 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:07:06.909 13:15:47 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh
00:07:06.909 13:15:47 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:06.909 13:15:47 json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:06.909 13:15:47 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:06.909 13:15:47 json_config -- paths/export.sh@5 -- # export PATH
00:07:06.909 13:15:47 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@47 -- # : 0
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']'
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF)
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}")
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@33 -- # '[' -n '' ']'
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']'
00:07:06.909 13:15:47 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]]
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]]
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]]
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 ))
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='')
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock')
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024')
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@33 -- # declare -A app_params
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json')
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@40 -- # last_event_id=0
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@359 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@360 -- # echo 'INFO: JSON configuration test init'
00:07:06.909 INFO: JSON configuration test init
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@361 -- # json_config_test_init
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@266 -- # timing_enter json_config_test_init
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@724 -- # xtrace_disable
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@267 -- # timing_enter json_config_setup_target
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@724 -- # xtrace_disable
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:06.909 13:15:47 json_config -- json_config/json_config.sh@269 -- # json_config_test_start_app target --wait-for-rpc
00:07:06.909 13:15:47 json_config -- json_config/common.sh@9 -- # local app=target
00:07:06.909 13:15:47 json_config -- json_config/common.sh@10 -- # shift
00:07:06.909 13:15:47 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]]
00:07:06.909 13:15:47 json_config -- json_config/common.sh@13 -- # [[ -z '' ]]
00:07:06.909 13:15:47 json_config -- json_config/common.sh@15 -- # local app_extra_params=
00:07:06.909 13:15:47 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]]
00:07:06.909 13:15:47 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]]
00:07:06.909 13:15:47 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=825378
00:07:06.909 13:15:47 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...'
00:07:06.909 Waiting for target to run...
00:07:06.909 13:15:47 json_config -- json_config/common.sh@25 -- # waitforlisten 825378 /var/tmp/spdk_tgt.sock
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@831 -- # '[' -z 825378 ']'
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock
00:07:06.909 13:15:47 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@836 -- # local max_retries=100
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...'
00:07:06.909 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@840 -- # xtrace_disable
00:07:06.909 13:15:47 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:07.170 [2024-07-25 13:15:47.707520] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:07:07.170 [2024-07-25 13:15:47.707574] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid825378 ]
00:07:07.430 [2024-07-25 13:15:48.005810] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:07:07.430 [2024-07-25 13:15:48.057370] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:08.000 13:15:48 json_config -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:07:08.000 13:15:48 json_config -- common/autotest_common.sh@864 -- # return 0
00:07:08.000 13:15:48 json_config -- json_config/common.sh@26 -- # echo ''
00:07:08.000
00:07:08.000 13:15:48 json_config -- json_config/json_config.sh@273 -- # create_accel_config
00:07:08.000 13:15:48 json_config -- json_config/json_config.sh@97 -- # timing_enter create_accel_config
00:07:08.000 13:15:48 json_config -- common/autotest_common.sh@724 -- # xtrace_disable
00:07:08.000 13:15:48 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:08.000 13:15:48 json_config -- json_config/json_config.sh@99 -- # [[ 1 -eq 1 ]]
00:07:08.000 13:15:48 json_config -- json_config/json_config.sh@100 -- # tgt_rpc dpdk_cryptodev_scan_accel_module
00:07:08.000 13:15:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module
00:07:08.000 13:15:48 json_config -- json_config/json_config.sh@101 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev
00:07:08.000 13:15:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev
00:07:08.259 [2024-07-25 13:15:48.887693] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:07:08.259 13:15:48 json_config -- json_config/json_config.sh@102 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev
00:07:08.259 13:15:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev
00:07:08.259 [2024-07-25 13:15:49.044074] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:07:08.519 13:15:49 json_config -- json_config/json_config.sh@105 -- # timing_exit create_accel_config
00:07:08.519 13:15:49 json_config -- common/autotest_common.sh@730 -- # xtrace_disable
00:07:08.519 13:15:49 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:08.519 13:15:49 json_config -- json_config/json_config.sh@277 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems
00:07:08.519 13:15:49 json_config -- json_config/json_config.sh@278 -- # tgt_rpc load_config
00:07:08.519 13:15:49 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config
00:07:08.519 [2024-07-25 13:15:49.300677] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@280 -- # tgt_check_notification_types
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types
00:07:13.799 13:15:54 json_config -- common/autotest_common.sh@724 -- # xtrace_disable
00:07:13.799 13:15:54 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@45 -- # local ret=0
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister')
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@46 -- # local enabled_types
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@48 -- # tgt_rpc notify_get_types
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]'
00:07:13.799 13:15:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister')
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@48 -- # local get_types
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@50 -- # local type_diff
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@51 -- # echo bdev_register bdev_unregister bdev_register bdev_unregister
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@51 -- # tr ' ' '\n'
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@51 -- # sort
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@51 -- # uniq -u
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@51 -- # type_diff=
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@53 -- # [[ -n '' ]]
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@58 -- # timing_exit tgt_check_notification_types
00:07:13.799 13:15:54 json_config -- common/autotest_common.sh@730 -- # xtrace_disable
00:07:13.799 13:15:54 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@59 -- # return 0
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@282 -- # [[ 1 -eq 1 ]]
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@283 -- # create_bdev_subsystem_config
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@109 -- # timing_enter create_bdev_subsystem_config
00:07:13.799 13:15:54 json_config -- common/autotest_common.sh@724 -- # xtrace_disable
00:07:13.799 13:15:54 json_config -- common/autotest_common.sh@10 -- # set +x
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@111 -- # expected_notifications=()
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@111 -- # local expected_notifications
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@115 -- # expected_notifications+=($(get_notifications))
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@115 -- # get_notifications
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0
00:07:13.799 13:15:54 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"'
00:07:13.799 13:15:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0
00:07:14.059 13:15:54 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1
00:07:14.059 13:15:54 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:14.059 13:15:54 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:14.059 13:15:54 json_config -- json_config/json_config.sh@117 -- # [[ 1 -eq 1 ]]
00:07:14.059 13:15:54 json_config -- json_config/json_config.sh@118 -- # local lvol_store_base_bdev=Nvme0n1
00:07:14.059 13:15:54 json_config -- json_config/json_config.sh@120 -- # tgt_rpc bdev_split_create Nvme0n1 2
00:07:14.059 13:15:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2
00:07:14.319 Nvme0n1p0 Nvme0n1p1
00:07:14.319 13:15:54 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_split_create Malloc0 3
00:07:14.319 13:15:54 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3
00:07:14.580 [2024-07-25 13:15:55.153116] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 [2024-07-25 13:15:55.153156] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:07:14.580
00:07:14.580 13:15:55 json_config -- json_config/json_config.sh@122 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 13:15:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3
00:07:14.580 Malloc3
00:07:14.580 13:15:55 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 13:15:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3
00:07:14.841 [2024-07-25 13:15:55.522117] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 [2024-07-25 13:15:55.522150] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened [2024-07-25 13:15:55.522163] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12ebb20 [2024-07-25 13:15:55.522169] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed [2024-07-25 13:15:55.523396] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered [2024-07-25 13:15:55.523415] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3
00:07:14.841 PTBdevFromMalloc3
00:07:14.841 13:15:55 json_config -- json_config/json_config.sh@125 -- # tgt_rpc bdev_null_create Null0 32 512
00:07:14.841 13:15:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512
00:07:15.101 Null0
00:07:15.101 13:15:55 json_config -- json_config/json_config.sh@127 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0
00:07:15.101 13:15:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0
00:07:15.101 Malloc0
00:07:15.361 13:15:55 json_config -- json_config/json_config.sh@128 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1
00:07:15.361 13:15:55 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1
00:07:15.361 Malloc1
00:07:15.361 13:16:56 json_config -- json_config/json_config.sh@141 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1)
00:07:15.361 13:15:56 json_config -- json_config/json_config.sh@144 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400
00:07:15.621 102400+0 records in
00:07:15.621 102400+0 records out
00:07:15.621 104857600 bytes (105 MB, 100 MiB) copied, 0.118922 s, 882 MB/s
00:07:15.621 13:15:56 json_config -- json_config/json_config.sh@145 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024
00:07:15.621 13:15:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024
00:07:15.621 aio_disk
00:07:15.621 13:15:56 json_config -- json_config/json_config.sh@146 -- # expected_notifications+=(bdev_register:aio_disk)
00:07:15.621 13:15:56 json_config -- json_config/json_config.sh@151 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test
00:07:15.621 13:15:56 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test
00:07:19.826 45ca3794-f98d-42e3-b05f-7c52e40f25e2
00:07:19.826 13:16:00 json_config -- json_config/json_config.sh@158 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)")
00:07:19.826 13:16:00 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32
00:07:19.826 13:16:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32
00:07:20.086 13:16:00 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32
00:07:20.086 13:16:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32
00:07:20.086 13:16:00 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0
00:07:20.086 13:16:00 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0
00:07:20.346 13:16:01 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0
00:07:20.346 13:16:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0
00:07:20.606 13:16:01 json_config -- json_config/json_config.sh@161 -- # [[ 1 -eq 1 ]]
00:07:20.606 13:16:01 json_config -- json_config/json_config.sh@162 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev
00:07:20.606 13:16:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev
00:07:20.866 MallocForCryptoBdev
00:07:20.866 13:16:01 json_config -- json_config/json_config.sh@163 -- # lspci -d:37c8
00:07:20.866 13:16:01 json_config -- json_config/json_config.sh@163 -- # wc -l
00:07:20.866 13:16:01 json_config -- json_config/json_config.sh@163 -- # [[ 3 -eq 0 ]]
00:07:20.866 13:16:01 json_config -- json_config/json_config.sh@166 -- # local crypto_driver=crypto_qat
00:07:20.866 13:16:01 json_config -- json_config/json_config.sh@169 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456
00:07:20.866 13:16:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456
00:07:21.127 [2024-07-25 13:16:01.659565] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored
00:07:21.127 CryptoMallocBdev
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@173 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev)
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@176 -- # [[ 0 -eq 1 ]]
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@182 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:b87b648e-9cc7-47d0-9f9f-83eec4b85f2a bdev_register:43d88846-ec9a-41c0-9eaf-09611a01f61a bdev_register:794ffbf2-0d97-470a-b9a6-742751a9945c bdev_register:71737ede-0198-4c7e-a8fc-416e823a1639 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@71 -- # local events_to_check
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@72 -- # local recorded_events
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@75 -- # events_to_check=($(printf '%s\n' "$@" | sort))
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@75 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:b87b648e-9cc7-47d0-9f9f-83eec4b85f2a bdev_register:43d88846-ec9a-41c0-9eaf-09611a01f61a bdev_register:794ffbf2-0d97-470a-b9a6-742751a9945c bdev_register:71737ede-0198-4c7e-a8fc-416e823a1639 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@75 -- # sort
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@76 -- # recorded_events=($(get_notifications | sort))
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@76 -- # get_notifications
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@76 -- # sort
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@63 -- # local ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@62 -- # tgt_rpc notify_get_notifications -i 0
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@62 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"'
00:07:21.127 13:16:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p1
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Nvme0n1p0
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc3
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:PTBdevFromMalloc3
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Null0
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p2
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p1
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc0p0
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:Malloc1
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:aio_disk
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:b87b648e-9cc7-47d0-9f9f-83eec4b85f2a
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:43d88846-ec9a-41c0-9eaf-09611a01f61a
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:794ffbf2-0d97-470a-b9a6-742751a9945c
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:71737ede-0198-4c7e-a8fc-416e823a1639
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:MallocForCryptoBdev
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@66 -- # echo bdev_register:CryptoMallocBdev
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # IFS=:
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@65 -- # read -r ev_type ev_ctx event_id
00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@78 -- # [[ bdev_register:43d88846-ec9a-41c0-9eaf-09611a01f61a bdev_register:71737ede-0198-4c7e-a8fc-416e823a1639 bdev_register:794ffbf2-0d97-470a-b9a6-742751a9945c bdev_register:aio_disk
bdev_register:b87b648e-9cc7-47d0-9f9f-83eec4b85f2a bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != \b\d\e\v\_\r\e\g\i\s\t\e\r\:\4\3\d\8\8\8\4\6\-\e\c\9\a\-\4\1\c\0\-\9\e\a\f\-\0\9\6\1\1\a\0\1\f\6\1\a\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\7\1\7\3\7\e\d\e\-\0\1\9\8\-\4\c\7\e\-\a\8\f\c\-\4\1\6\e\8\2\3\a\1\6\3\9\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\7\9\4\f\f\b\f\2\-\0\d\9\7\-\4\7\0\a\-\b\9\a\6\-\7\4\2\7\5\1\a\9\9\4\5\c\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\b\8\7\b\6\4\8\e\-\9\c\c\7\-\4\7\d\0\-\9\f\9\f\-\8\3\e\e\c\4\b\8\5\f\2\a\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@90 -- # cat 00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@90 -- # printf ' %s\n' bdev_register:43d88846-ec9a-41c0-9eaf-09611a01f61a bdev_register:71737ede-0198-4c7e-a8fc-416e823a1639 bdev_register:794ffbf2-0d97-470a-b9a6-742751a9945c bdev_register:aio_disk bdev_register:b87b648e-9cc7-47d0-9f9f-83eec4b85f2a bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 
bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:07:21.127 Expected events matched: 00:07:21.127 bdev_register:43d88846-ec9a-41c0-9eaf-09611a01f61a 00:07:21.127 bdev_register:71737ede-0198-4c7e-a8fc-416e823a1639 00:07:21.127 bdev_register:794ffbf2-0d97-470a-b9a6-742751a9945c 00:07:21.127 bdev_register:aio_disk 00:07:21.127 bdev_register:b87b648e-9cc7-47d0-9f9f-83eec4b85f2a 00:07:21.127 bdev_register:CryptoMallocBdev 00:07:21.127 bdev_register:Malloc0 00:07:21.127 bdev_register:Malloc0p0 00:07:21.127 bdev_register:Malloc0p1 00:07:21.127 bdev_register:Malloc0p2 00:07:21.127 bdev_register:Malloc1 00:07:21.127 bdev_register:Malloc3 00:07:21.127 bdev_register:MallocForCryptoBdev 00:07:21.127 bdev_register:Null0 00:07:21.127 bdev_register:Nvme0n1 00:07:21.127 bdev_register:Nvme0n1p0 00:07:21.127 bdev_register:Nvme0n1p1 00:07:21.127 bdev_register:PTBdevFromMalloc3 00:07:21.127 13:16:01 json_config -- json_config/json_config.sh@184 -- # timing_exit create_bdev_subsystem_config 00:07:21.127 13:16:01 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:21.128 13:16:01 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:21.128 13:16:01 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:07:21.128 13:16:01 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:07:21.128 13:16:01 json_config -- json_config/json_config.sh@294 -- # [[ 0 -eq 1 ]] 00:07:21.128 13:16:01 json_config -- json_config/json_config.sh@297 -- # timing_exit json_config_setup_target 00:07:21.128 13:16:01 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:21.128 13:16:01 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:21.388 13:16:01 json_config -- json_config/json_config.sh@299 -- # [[ 0 -eq 1 ]] 00:07:21.388 13:16:01 json_config -- json_config/json_config.sh@304 -- # 
tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:21.388 13:16:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:07:21.388 MallocBdevForConfigChangeCheck 00:07:21.388 13:16:02 json_config -- json_config/json_config.sh@306 -- # timing_exit json_config_test_init 00:07:21.388 13:16:02 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:21.388 13:16:02 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:21.648 13:16:02 json_config -- json_config/json_config.sh@363 -- # tgt_rpc save_config 00:07:21.648 13:16:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:21.908 13:16:02 json_config -- json_config/json_config.sh@365 -- # echo 'INFO: shutting down applications...' 00:07:21.908 INFO: shutting down applications... 
00:07:21.908 13:16:02 json_config -- json_config/json_config.sh@366 -- # [[ 0 -eq 1 ]] 00:07:21.908 13:16:02 json_config -- json_config/json_config.sh@372 -- # json_config_clear target 00:07:21.908 13:16:02 json_config -- json_config/json_config.sh@336 -- # [[ -n 22 ]] 00:07:21.908 13:16:02 json_config -- json_config/json_config.sh@337 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:07:21.908 [2024-07-25 13:16:02.674512] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:07:24.450 Calling clear_iscsi_subsystem 00:07:24.450 Calling clear_nvmf_subsystem 00:07:24.450 Calling clear_nbd_subsystem 00:07:24.450 Calling clear_ublk_subsystem 00:07:24.450 Calling clear_vhost_blk_subsystem 00:07:24.450 Calling clear_vhost_scsi_subsystem 00:07:24.450 Calling clear_bdev_subsystem 00:07:24.450 13:16:05 json_config -- json_config/json_config.sh@341 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:07:24.450 13:16:05 json_config -- json_config/json_config.sh@347 -- # count=100 00:07:24.450 13:16:05 json_config -- json_config/json_config.sh@348 -- # '[' 100 -gt 0 ']' 00:07:24.450 13:16:05 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:24.450 13:16:05 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:07:24.450 13:16:05 json_config -- json_config/json_config.sh@349 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:07:24.710 13:16:05 json_config -- json_config/json_config.sh@349 -- # break 00:07:24.710 13:16:05 json_config -- json_config/json_config.sh@354 -- # '[' 100 -eq 0 ']' 00:07:24.710 13:16:05 
json_config -- json_config/json_config.sh@373 -- # json_config_test_shutdown_app target 00:07:24.710 13:16:05 json_config -- json_config/common.sh@31 -- # local app=target 00:07:24.710 13:16:05 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:24.710 13:16:05 json_config -- json_config/common.sh@35 -- # [[ -n 825378 ]] 00:07:24.710 13:16:05 json_config -- json_config/common.sh@38 -- # kill -SIGINT 825378 00:07:24.710 13:16:05 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:24.710 13:16:05 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:24.710 13:16:05 json_config -- json_config/common.sh@41 -- # kill -0 825378 00:07:24.710 13:16:05 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:07:25.282 13:16:05 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:07:25.282 13:16:05 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:25.282 13:16:05 json_config -- json_config/common.sh@41 -- # kill -0 825378 00:07:25.282 13:16:05 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:25.282 13:16:05 json_config -- json_config/common.sh@43 -- # break 00:07:25.282 13:16:05 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:25.282 13:16:05 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:25.282 SPDK target shutdown done 00:07:25.282 13:16:05 json_config -- json_config/json_config.sh@375 -- # echo 'INFO: relaunching applications...' 00:07:25.282 INFO: relaunching applications... 
00:07:25.282 13:16:05 json_config -- json_config/json_config.sh@376 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:25.282 13:16:05 json_config -- json_config/common.sh@9 -- # local app=target 00:07:25.282 13:16:05 json_config -- json_config/common.sh@10 -- # shift 00:07:25.282 13:16:05 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:25.282 13:16:05 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:25.282 13:16:05 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:07:25.282 13:16:05 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:25.282 13:16:05 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:25.282 13:16:05 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=828612 00:07:25.283 13:16:05 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:25.283 Waiting for target to run... 00:07:25.283 13:16:05 json_config -- json_config/common.sh@25 -- # waitforlisten 828612 /var/tmp/spdk_tgt.sock 00:07:25.283 13:16:05 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:25.283 13:16:05 json_config -- common/autotest_common.sh@831 -- # '[' -z 828612 ']' 00:07:25.283 13:16:05 json_config -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:25.283 13:16:05 json_config -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:25.283 13:16:05 json_config -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:25.283 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 
00:07:25.283 13:16:05 json_config -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:25.283 13:16:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:25.283 [2024-07-25 13:16:06.015512] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:07:25.283 [2024-07-25 13:16:06.015573] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid828612 ] 00:07:25.864 [2024-07-25 13:16:06.447594] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.864 [2024-07-25 13:16:06.508639] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:25.864 [2024-07-25 13:16:06.562481] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:07:25.864 [2024-07-25 13:16:06.570513] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:07:25.864 [2024-07-25 13:16:06.578531] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:07:26.125 [2024-07-25 13:16:06.658781] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:07:28.037 [2024-07-25 13:16:08.792999] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:28.037 [2024-07-25 13:16:08.793042] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:28.037 [2024-07-25 13:16:08.793050] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:28.037 [2024-07-25 13:16:08.801014] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:07:28.037 [2024-07-25 13:16:08.801032] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: Nvme0n1 00:07:28.037 [2024-07-25 13:16:08.809028] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:28.037 [2024-07-25 13:16:08.809044] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:07:28.037 [2024-07-25 13:16:08.817060] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:07:28.037 [2024-07-25 13:16:08.817079] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:07:28.037 [2024-07-25 13:16:08.817085] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:31.336 [2024-07-25 13:16:11.674518] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:31.336 [2024-07-25 13:16:11.674558] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:31.336 [2024-07-25 13:16:11.674568] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17a30a0 00:07:31.336 [2024-07-25 13:16:11.674574] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:31.336 [2024-07-25 13:16:11.674852] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:31.336 [2024-07-25 13:16:11.674872] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:07:31.336 13:16:11 json_config -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:31.336 13:16:11 json_config -- common/autotest_common.sh@864 -- # return 0 00:07:31.336 13:16:11 json_config -- json_config/common.sh@26 -- # echo '' 00:07:31.336 00:07:31.336 13:16:11 json_config -- json_config/json_config.sh@377 -- # [[ 0 -eq 1 ]] 00:07:31.336 13:16:11 json_config -- json_config/json_config.sh@381 -- # echo 'INFO: Checking if target configuration is the same...' 00:07:31.336 INFO: Checking if target configuration is the same... 
00:07:31.336 13:16:11 json_config -- json_config/json_config.sh@382 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:31.336 13:16:11 json_config -- json_config/json_config.sh@382 -- # tgt_rpc save_config 00:07:31.336 13:16:11 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:31.336 + '[' 2 -ne 2 ']' 00:07:31.336 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:31.336 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:07:31.336 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:31.336 +++ basename /dev/fd/62 00:07:31.336 ++ mktemp /tmp/62.XXX 00:07:31.336 + tmp_file_1=/tmp/62.e5Q 00:07:31.336 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:31.336 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:31.336 + tmp_file_2=/tmp/spdk_tgt_config.json.MEE 00:07:31.336 + ret=0 00:07:31.336 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:31.596 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:31.596 + diff -u /tmp/62.e5Q /tmp/spdk_tgt_config.json.MEE 00:07:31.596 + echo 'INFO: JSON config files are the same' 00:07:31.596 INFO: JSON config files are the same 00:07:31.596 + rm /tmp/62.e5Q /tmp/spdk_tgt_config.json.MEE 00:07:31.596 + exit 0 00:07:31.596 13:16:12 json_config -- json_config/json_config.sh@383 -- # [[ 0 -eq 1 ]] 00:07:31.596 13:16:12 json_config -- json_config/json_config.sh@388 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:07:31.596 INFO: changing configuration and checking if this can be detected... 
00:07:31.596 13:16:12 json_config -- json_config/json_config.sh@390 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:31.596 13:16:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:07:31.857 13:16:12 json_config -- json_config/json_config.sh@391 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:31.857 13:16:12 json_config -- json_config/json_config.sh@391 -- # tgt_rpc save_config 00:07:31.857 13:16:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:07:31.857 + '[' 2 -ne 2 ']' 00:07:31.857 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:07:31.857 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:07:31.857 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:31.857 +++ basename /dev/fd/62 00:07:31.857 ++ mktemp /tmp/62.XXX 00:07:31.857 + tmp_file_1=/tmp/62.T6c 00:07:31.857 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:31.857 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:07:31.857 + tmp_file_2=/tmp/spdk_tgt_config.json.iyh 00:07:31.857 + ret=0 00:07:31.857 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:32.117 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:07:32.117 + diff -u /tmp/62.T6c /tmp/spdk_tgt_config.json.iyh 00:07:32.117 + ret=1 00:07:32.117 + echo '=== Start of file: /tmp/62.T6c ===' 00:07:32.117 + cat /tmp/62.T6c 00:07:32.117 + echo '=== End of file: /tmp/62.T6c ===' 00:07:32.117 + echo '' 00:07:32.117 + echo '=== Start of file: /tmp/spdk_tgt_config.json.iyh ===' 00:07:32.117 + cat /tmp/spdk_tgt_config.json.iyh 00:07:32.117 + echo '=== End of file: /tmp/spdk_tgt_config.json.iyh ===' 00:07:32.117 + echo '' 00:07:32.117 + rm /tmp/62.T6c /tmp/spdk_tgt_config.json.iyh 00:07:32.117 + exit 1 00:07:32.117 13:16:12 json_config -- json_config/json_config.sh@395 -- # echo 'INFO: configuration change detected.' 00:07:32.117 INFO: configuration change detected. 
00:07:32.117 13:16:12 json_config -- json_config/json_config.sh@398 -- # json_config_test_fini 00:07:32.117 13:16:12 json_config -- json_config/json_config.sh@310 -- # timing_enter json_config_test_fini 00:07:32.117 13:16:12 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:32.117 13:16:12 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:32.117 13:16:12 json_config -- json_config/json_config.sh@311 -- # local ret=0 00:07:32.117 13:16:12 json_config -- json_config/json_config.sh@313 -- # [[ -n '' ]] 00:07:32.117 13:16:12 json_config -- json_config/json_config.sh@321 -- # [[ -n 828612 ]] 00:07:32.117 13:16:12 json_config -- json_config/json_config.sh@324 -- # cleanup_bdev_subsystem_config 00:07:32.117 13:16:12 json_config -- json_config/json_config.sh@188 -- # timing_enter cleanup_bdev_subsystem_config 00:07:32.117 13:16:12 json_config -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:32.117 13:16:12 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:32.117 13:16:12 json_config -- json_config/json_config.sh@190 -- # [[ 1 -eq 1 ]] 00:07:32.117 13:16:12 json_config -- json_config/json_config.sh@191 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:07:32.117 13:16:12 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:07:32.383 13:16:13 json_config -- json_config/json_config.sh@192 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:07:32.383 13:16:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:07:32.678 13:16:13 json_config -- json_config/json_config.sh@193 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:07:32.678 13:16:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:07:32.678 13:16:13 json_config -- json_config/json_config.sh@194 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:07:32.678 13:16:13 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:07:32.938 13:16:13 json_config -- json_config/json_config.sh@197 -- # uname -s 00:07:32.938 13:16:13 json_config -- json_config/json_config.sh@197 -- # [[ Linux = Linux ]] 00:07:32.938 13:16:13 json_config -- json_config/json_config.sh@198 -- # rm -f /sample_aio 00:07:32.938 13:16:13 json_config -- json_config/json_config.sh@201 -- # [[ 0 -eq 1 ]] 00:07:32.938 13:16:13 json_config -- json_config/json_config.sh@205 -- # timing_exit cleanup_bdev_subsystem_config 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:32.938 13:16:13 json_config -- json_config/json_config.sh@327 -- # killprocess 828612 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@950 -- # '[' -z 828612 ']' 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@954 -- # kill -0 828612 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@955 -- # uname 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 828612 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@968 -- # echo 'killing process with pid 828612' 00:07:32.938 killing process with pid 828612 00:07:32.938 13:16:13 json_config -- common/autotest_common.sh@969 -- # kill 828612 00:07:32.938 13:16:13 json_config -- 
common/autotest_common.sh@974 -- # wait 828612 00:07:35.480 13:16:16 json_config -- json_config/json_config.sh@330 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:07:35.480 13:16:16 json_config -- json_config/json_config.sh@331 -- # timing_exit json_config_test_fini 00:07:35.480 13:16:16 json_config -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:35.480 13:16:16 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:35.480 13:16:16 json_config -- json_config/json_config.sh@332 -- # return 0 00:07:35.480 13:16:16 json_config -- json_config/json_config.sh@400 -- # echo 'INFO: Success' 00:07:35.480 INFO: Success 00:07:35.480 00:07:35.480 real 0m28.701s 00:07:35.480 user 0m32.443s 00:07:35.480 sys 0m2.755s 00:07:35.480 13:16:16 json_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:35.480 13:16:16 json_config -- common/autotest_common.sh@10 -- # set +x 00:07:35.480 ************************************ 00:07:35.480 END TEST json_config 00:07:35.480 ************************************ 00:07:35.480 13:16:16 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:35.480 13:16:16 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:35.480 13:16:16 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:35.480 13:16:16 -- common/autotest_common.sh@10 -- # set +x 00:07:35.742 ************************************ 00:07:35.742 START TEST json_config_extra_key 00:07:35.742 ************************************ 00:07:35.742 13:16:16 json_config_extra_key -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:07:35.742 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:07:35.742 13:16:16 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e 
/bin/wpdk_common.sh ]] 00:07:35.742 13:16:16 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:07:35.742 13:16:16 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:07:35.742 13:16:16 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:35.742 13:16:16 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:35.742 13:16:16 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:35.742 13:16:16 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:07:35.742 13:16:16 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:07:35.742 13:16:16 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:07:35.743 INFO: launching applications... 00:07:35.743 13:16:16 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=830542 00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:07:35.743 Waiting for target to run... 
00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 830542 /var/tmp/spdk_tgt.sock 00:07:35.743 13:16:16 json_config_extra_key -- common/autotest_common.sh@831 -- # '[' -z 830542 ']' 00:07:35.743 13:16:16 json_config_extra_key -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:07:35.743 13:16:16 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:07:35.743 13:16:16 json_config_extra_key -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:35.743 13:16:16 json_config_extra_key -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:07:35.743 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:07:35.743 13:16:16 json_config_extra_key -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:35.743 13:16:16 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:35.743 [2024-07-25 13:16:16.483677] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:07:35.743 [2024-07-25 13:16:16.483747] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid830542 ] 00:07:36.003 [2024-07-25 13:16:16.772413] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:36.263 [2024-07-25 13:16:16.821375] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:36.523 13:16:17 json_config_extra_key -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:36.523 13:16:17 json_config_extra_key -- common/autotest_common.sh@864 -- # return 0 00:07:36.523 13:16:17 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:07:36.523 00:07:36.523 13:16:17 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:07:36.523 INFO: shutting down applications... 00:07:36.523 13:16:17 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:07:36.523 13:16:17 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:07:36.523 13:16:17 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:07:36.523 13:16:17 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 830542 ]] 00:07:36.523 13:16:17 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 830542 00:07:36.523 13:16:17 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:07:36.523 13:16:17 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:07:36.523 13:16:17 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 830542 00:07:36.523 13:16:17 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:07:37.095 13:16:17 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:07:37.095 13:16:17 json_config_extra_key -- 
json_config/common.sh@40 -- # (( i < 30 )) 00:07:37.095 13:16:17 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 830542 00:07:37.095 13:16:17 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:07:37.095 13:16:17 json_config_extra_key -- json_config/common.sh@43 -- # break 00:07:37.095 13:16:17 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:07:37.095 13:16:17 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:07:37.095 SPDK target shutdown done 00:07:37.095 13:16:17 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:07:37.095 Success 00:07:37.095 00:07:37.095 real 0m1.511s 00:07:37.095 user 0m1.117s 00:07:37.095 sys 0m0.385s 00:07:37.095 13:16:17 json_config_extra_key -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:37.095 13:16:17 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:07:37.095 ************************************ 00:07:37.095 END TEST json_config_extra_key 00:07:37.095 ************************************ 00:07:37.095 13:16:17 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:37.095 13:16:17 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:37.095 13:16:17 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:37.095 13:16:17 -- common/autotest_common.sh@10 -- # set +x 00:07:37.356 ************************************ 00:07:37.356 START TEST alias_rpc 00:07:37.356 ************************************ 00:07:37.356 13:16:17 alias_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:07:37.356 * Looking for test storage... 
00:07:37.356 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:07:37.356 13:16:17 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:37.356 13:16:17 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=830890 00:07:37.356 13:16:17 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 830890 00:07:37.356 13:16:17 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:37.356 13:16:17 alias_rpc -- common/autotest_common.sh@831 -- # '[' -z 830890 ']' 00:07:37.356 13:16:17 alias_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:37.356 13:16:17 alias_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:37.356 13:16:17 alias_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:37.356 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:37.356 13:16:17 alias_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:37.356 13:16:17 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:37.356 [2024-07-25 13:16:18.051670] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:07:37.356 [2024-07-25 13:16:18.051724] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid830890 ] 00:07:37.356 [2024-07-25 13:16:18.138972] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:37.617 [2024-07-25 13:16:18.202080] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:38.188 13:16:18 alias_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:38.188 13:16:18 alias_rpc -- common/autotest_common.sh@864 -- # return 0 00:07:38.188 13:16:18 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:07:38.448 13:16:19 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 830890 00:07:38.448 13:16:19 alias_rpc -- common/autotest_common.sh@950 -- # '[' -z 830890 ']' 00:07:38.448 13:16:19 alias_rpc -- common/autotest_common.sh@954 -- # kill -0 830890 00:07:38.448 13:16:19 alias_rpc -- common/autotest_common.sh@955 -- # uname 00:07:38.448 13:16:19 alias_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:38.448 13:16:19 alias_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 830890 00:07:38.448 13:16:19 alias_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:38.448 13:16:19 alias_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:38.448 13:16:19 alias_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 830890' 00:07:38.448 killing process with pid 830890 00:07:38.448 13:16:19 alias_rpc -- common/autotest_common.sh@969 -- # kill 830890 00:07:38.448 13:16:19 alias_rpc -- common/autotest_common.sh@974 -- # wait 830890 00:07:38.708 00:07:38.708 real 0m1.461s 00:07:38.708 user 0m1.685s 00:07:38.708 sys 0m0.368s 00:07:38.708 13:16:19 alias_rpc -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:07:38.708 13:16:19 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:38.708 ************************************ 00:07:38.708 END TEST alias_rpc 00:07:38.708 ************************************ 00:07:38.708 13:16:19 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:07:38.708 13:16:19 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:38.708 13:16:19 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:38.708 13:16:19 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:38.708 13:16:19 -- common/autotest_common.sh@10 -- # set +x 00:07:38.708 ************************************ 00:07:38.708 START TEST spdkcli_tcp 00:07:38.708 ************************************ 00:07:38.708 13:16:19 spdkcli_tcp -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:07:38.968 * Looking for test storage... 
00:07:38.968 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:07:38.968 13:16:19 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:07:38.968 13:16:19 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:07:38.968 13:16:19 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:07:38.968 13:16:19 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:07:38.968 13:16:19 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:07:38.968 13:16:19 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:07:38.968 13:16:19 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:07:38.968 13:16:19 spdkcli_tcp -- common/autotest_common.sh@724 -- # xtrace_disable 00:07:38.968 13:16:19 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:38.968 13:16:19 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=831249 00:07:38.968 13:16:19 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 831249 00:07:38.968 13:16:19 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:07:38.968 13:16:19 spdkcli_tcp -- common/autotest_common.sh@831 -- # '[' -z 831249 ']' 00:07:38.968 13:16:19 spdkcli_tcp -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:38.968 13:16:19 spdkcli_tcp -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:38.968 13:16:19 spdkcli_tcp -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:38.968 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:38.968 13:16:19 spdkcli_tcp -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:38.968 13:16:19 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:38.968 [2024-07-25 13:16:19.597590] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:07:38.968 [2024-07-25 13:16:19.597650] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid831249 ] 00:07:38.968 [2024-07-25 13:16:19.688765] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:38.968 [2024-07-25 13:16:19.757850] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:38.968 [2024-07-25 13:16:19.757855] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:39.907 13:16:20 spdkcli_tcp -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:39.907 13:16:20 spdkcli_tcp -- common/autotest_common.sh@864 -- # return 0 00:07:39.907 13:16:20 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=831283 00:07:39.907 13:16:20 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:07:39.907 13:16:20 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:07:39.907 [ 00:07:39.907 "bdev_malloc_delete", 00:07:39.907 "bdev_malloc_create", 00:07:39.907 "bdev_null_resize", 00:07:39.907 "bdev_null_delete", 00:07:39.907 "bdev_null_create", 00:07:39.907 "bdev_nvme_cuse_unregister", 00:07:39.907 "bdev_nvme_cuse_register", 00:07:39.907 "bdev_opal_new_user", 00:07:39.907 "bdev_opal_set_lock_state", 00:07:39.907 "bdev_opal_delete", 00:07:39.907 "bdev_opal_get_info", 00:07:39.907 "bdev_opal_create", 00:07:39.907 "bdev_nvme_opal_revert", 00:07:39.907 "bdev_nvme_opal_init", 00:07:39.907 "bdev_nvme_send_cmd", 00:07:39.907 
"bdev_nvme_get_path_iostat", 00:07:39.907 "bdev_nvme_get_mdns_discovery_info", 00:07:39.907 "bdev_nvme_stop_mdns_discovery", 00:07:39.907 "bdev_nvme_start_mdns_discovery", 00:07:39.907 "bdev_nvme_set_multipath_policy", 00:07:39.907 "bdev_nvme_set_preferred_path", 00:07:39.907 "bdev_nvme_get_io_paths", 00:07:39.907 "bdev_nvme_remove_error_injection", 00:07:39.907 "bdev_nvme_add_error_injection", 00:07:39.907 "bdev_nvme_get_discovery_info", 00:07:39.907 "bdev_nvme_stop_discovery", 00:07:39.907 "bdev_nvme_start_discovery", 00:07:39.907 "bdev_nvme_get_controller_health_info", 00:07:39.907 "bdev_nvme_disable_controller", 00:07:39.907 "bdev_nvme_enable_controller", 00:07:39.907 "bdev_nvme_reset_controller", 00:07:39.907 "bdev_nvme_get_transport_statistics", 00:07:39.907 "bdev_nvme_apply_firmware", 00:07:39.907 "bdev_nvme_detach_controller", 00:07:39.907 "bdev_nvme_get_controllers", 00:07:39.908 "bdev_nvme_attach_controller", 00:07:39.908 "bdev_nvme_set_hotplug", 00:07:39.908 "bdev_nvme_set_options", 00:07:39.908 "bdev_passthru_delete", 00:07:39.908 "bdev_passthru_create", 00:07:39.908 "bdev_lvol_set_parent_bdev", 00:07:39.908 "bdev_lvol_set_parent", 00:07:39.908 "bdev_lvol_check_shallow_copy", 00:07:39.908 "bdev_lvol_start_shallow_copy", 00:07:39.908 "bdev_lvol_grow_lvstore", 00:07:39.908 "bdev_lvol_get_lvols", 00:07:39.908 "bdev_lvol_get_lvstores", 00:07:39.908 "bdev_lvol_delete", 00:07:39.908 "bdev_lvol_set_read_only", 00:07:39.908 "bdev_lvol_resize", 00:07:39.908 "bdev_lvol_decouple_parent", 00:07:39.908 "bdev_lvol_inflate", 00:07:39.908 "bdev_lvol_rename", 00:07:39.908 "bdev_lvol_clone_bdev", 00:07:39.908 "bdev_lvol_clone", 00:07:39.908 "bdev_lvol_snapshot", 00:07:39.908 "bdev_lvol_create", 00:07:39.908 "bdev_lvol_delete_lvstore", 00:07:39.908 "bdev_lvol_rename_lvstore", 00:07:39.908 "bdev_lvol_create_lvstore", 00:07:39.908 "bdev_raid_set_options", 00:07:39.908 "bdev_raid_remove_base_bdev", 00:07:39.908 "bdev_raid_add_base_bdev", 00:07:39.908 "bdev_raid_delete", 
00:07:39.908 "bdev_raid_create", 00:07:39.908 "bdev_raid_get_bdevs", 00:07:39.908 "bdev_error_inject_error", 00:07:39.908 "bdev_error_delete", 00:07:39.908 "bdev_error_create", 00:07:39.908 "bdev_split_delete", 00:07:39.908 "bdev_split_create", 00:07:39.908 "bdev_delay_delete", 00:07:39.908 "bdev_delay_create", 00:07:39.908 "bdev_delay_update_latency", 00:07:39.908 "bdev_zone_block_delete", 00:07:39.908 "bdev_zone_block_create", 00:07:39.908 "blobfs_create", 00:07:39.908 "blobfs_detect", 00:07:39.908 "blobfs_set_cache_size", 00:07:39.908 "bdev_crypto_delete", 00:07:39.908 "bdev_crypto_create", 00:07:39.908 "bdev_compress_delete", 00:07:39.908 "bdev_compress_create", 00:07:39.908 "bdev_compress_get_orphans", 00:07:39.908 "bdev_aio_delete", 00:07:39.908 "bdev_aio_rescan", 00:07:39.908 "bdev_aio_create", 00:07:39.908 "bdev_ftl_set_property", 00:07:39.908 "bdev_ftl_get_properties", 00:07:39.908 "bdev_ftl_get_stats", 00:07:39.908 "bdev_ftl_unmap", 00:07:39.908 "bdev_ftl_unload", 00:07:39.908 "bdev_ftl_delete", 00:07:39.908 "bdev_ftl_load", 00:07:39.908 "bdev_ftl_create", 00:07:39.908 "bdev_virtio_attach_controller", 00:07:39.908 "bdev_virtio_scsi_get_devices", 00:07:39.908 "bdev_virtio_detach_controller", 00:07:39.908 "bdev_virtio_blk_set_hotplug", 00:07:39.908 "bdev_iscsi_delete", 00:07:39.908 "bdev_iscsi_create", 00:07:39.908 "bdev_iscsi_set_options", 00:07:39.908 "accel_error_inject_error", 00:07:39.908 "ioat_scan_accel_module", 00:07:39.908 "dsa_scan_accel_module", 00:07:39.908 "iaa_scan_accel_module", 00:07:39.908 "dpdk_cryptodev_get_driver", 00:07:39.908 "dpdk_cryptodev_set_driver", 00:07:39.908 "dpdk_cryptodev_scan_accel_module", 00:07:39.908 "compressdev_scan_accel_module", 00:07:39.908 "keyring_file_remove_key", 00:07:39.908 "keyring_file_add_key", 00:07:39.908 "keyring_linux_set_options", 00:07:39.908 "iscsi_get_histogram", 00:07:39.908 "iscsi_enable_histogram", 00:07:39.908 "iscsi_set_options", 00:07:39.908 "iscsi_get_auth_groups", 00:07:39.908 
"iscsi_auth_group_remove_secret", 00:07:39.908 "iscsi_auth_group_add_secret", 00:07:39.908 "iscsi_delete_auth_group", 00:07:39.908 "iscsi_create_auth_group", 00:07:39.908 "iscsi_set_discovery_auth", 00:07:39.908 "iscsi_get_options", 00:07:39.908 "iscsi_target_node_request_logout", 00:07:39.908 "iscsi_target_node_set_redirect", 00:07:39.908 "iscsi_target_node_set_auth", 00:07:39.908 "iscsi_target_node_add_lun", 00:07:39.908 "iscsi_get_stats", 00:07:39.908 "iscsi_get_connections", 00:07:39.908 "iscsi_portal_group_set_auth", 00:07:39.908 "iscsi_start_portal_group", 00:07:39.908 "iscsi_delete_portal_group", 00:07:39.908 "iscsi_create_portal_group", 00:07:39.908 "iscsi_get_portal_groups", 00:07:39.908 "iscsi_delete_target_node", 00:07:39.908 "iscsi_target_node_remove_pg_ig_maps", 00:07:39.908 "iscsi_target_node_add_pg_ig_maps", 00:07:39.908 "iscsi_create_target_node", 00:07:39.908 "iscsi_get_target_nodes", 00:07:39.908 "iscsi_delete_initiator_group", 00:07:39.908 "iscsi_initiator_group_remove_initiators", 00:07:39.908 "iscsi_initiator_group_add_initiators", 00:07:39.908 "iscsi_create_initiator_group", 00:07:39.908 "iscsi_get_initiator_groups", 00:07:39.908 "nvmf_set_crdt", 00:07:39.908 "nvmf_set_config", 00:07:39.908 "nvmf_set_max_subsystems", 00:07:39.908 "nvmf_stop_mdns_prr", 00:07:39.908 "nvmf_publish_mdns_prr", 00:07:39.908 "nvmf_subsystem_get_listeners", 00:07:39.908 "nvmf_subsystem_get_qpairs", 00:07:39.908 "nvmf_subsystem_get_controllers", 00:07:39.908 "nvmf_get_stats", 00:07:39.908 "nvmf_get_transports", 00:07:39.908 "nvmf_create_transport", 00:07:39.908 "nvmf_get_targets", 00:07:39.908 "nvmf_delete_target", 00:07:39.908 "nvmf_create_target", 00:07:39.908 "nvmf_subsystem_allow_any_host", 00:07:39.908 "nvmf_subsystem_remove_host", 00:07:39.908 "nvmf_subsystem_add_host", 00:07:39.908 "nvmf_ns_remove_host", 00:07:39.908 "nvmf_ns_add_host", 00:07:39.908 "nvmf_subsystem_remove_ns", 00:07:39.908 "nvmf_subsystem_add_ns", 00:07:39.908 
"nvmf_subsystem_listener_set_ana_state", 00:07:39.908 "nvmf_discovery_get_referrals", 00:07:39.908 "nvmf_discovery_remove_referral", 00:07:39.908 "nvmf_discovery_add_referral", 00:07:39.908 "nvmf_subsystem_remove_listener", 00:07:39.908 "nvmf_subsystem_add_listener", 00:07:39.908 "nvmf_delete_subsystem", 00:07:39.908 "nvmf_create_subsystem", 00:07:39.908 "nvmf_get_subsystems", 00:07:39.908 "env_dpdk_get_mem_stats", 00:07:39.908 "nbd_get_disks", 00:07:39.908 "nbd_stop_disk", 00:07:39.908 "nbd_start_disk", 00:07:39.908 "ublk_recover_disk", 00:07:39.908 "ublk_get_disks", 00:07:39.908 "ublk_stop_disk", 00:07:39.908 "ublk_start_disk", 00:07:39.908 "ublk_destroy_target", 00:07:39.908 "ublk_create_target", 00:07:39.908 "virtio_blk_create_transport", 00:07:39.909 "virtio_blk_get_transports", 00:07:39.909 "vhost_controller_set_coalescing", 00:07:39.909 "vhost_get_controllers", 00:07:39.909 "vhost_delete_controller", 00:07:39.909 "vhost_create_blk_controller", 00:07:39.909 "vhost_scsi_controller_remove_target", 00:07:39.909 "vhost_scsi_controller_add_target", 00:07:39.909 "vhost_start_scsi_controller", 00:07:39.909 "vhost_create_scsi_controller", 00:07:39.909 "thread_set_cpumask", 00:07:39.909 "framework_get_governor", 00:07:39.909 "framework_get_scheduler", 00:07:39.909 "framework_set_scheduler", 00:07:39.909 "framework_get_reactors", 00:07:39.909 "thread_get_io_channels", 00:07:39.909 "thread_get_pollers", 00:07:39.909 "thread_get_stats", 00:07:39.909 "framework_monitor_context_switch", 00:07:39.909 "spdk_kill_instance", 00:07:39.909 "log_enable_timestamps", 00:07:39.909 "log_get_flags", 00:07:39.909 "log_clear_flag", 00:07:39.909 "log_set_flag", 00:07:39.909 "log_get_level", 00:07:39.909 "log_set_level", 00:07:39.909 "log_get_print_level", 00:07:39.909 "log_set_print_level", 00:07:39.909 "framework_enable_cpumask_locks", 00:07:39.909 "framework_disable_cpumask_locks", 00:07:39.909 "framework_wait_init", 00:07:39.909 "framework_start_init", 00:07:39.909 "scsi_get_devices", 
00:07:39.909 "bdev_get_histogram", 00:07:39.909 "bdev_enable_histogram", 00:07:39.909 "bdev_set_qos_limit", 00:07:39.909 "bdev_set_qd_sampling_period", 00:07:39.909 "bdev_get_bdevs", 00:07:39.909 "bdev_reset_iostat", 00:07:39.909 "bdev_get_iostat", 00:07:39.909 "bdev_examine", 00:07:39.909 "bdev_wait_for_examine", 00:07:39.909 "bdev_set_options", 00:07:39.909 "notify_get_notifications", 00:07:39.909 "notify_get_types", 00:07:39.909 "accel_get_stats", 00:07:39.909 "accel_set_options", 00:07:39.909 "accel_set_driver", 00:07:39.909 "accel_crypto_key_destroy", 00:07:39.909 "accel_crypto_keys_get", 00:07:39.909 "accel_crypto_key_create", 00:07:39.909 "accel_assign_opc", 00:07:39.909 "accel_get_module_info", 00:07:39.909 "accel_get_opc_assignments", 00:07:39.909 "vmd_rescan", 00:07:39.909 "vmd_remove_device", 00:07:39.909 "vmd_enable", 00:07:39.909 "sock_get_default_impl", 00:07:39.909 "sock_set_default_impl", 00:07:39.909 "sock_impl_set_options", 00:07:39.909 "sock_impl_get_options", 00:07:39.909 "iobuf_get_stats", 00:07:39.909 "iobuf_set_options", 00:07:39.909 "framework_get_pci_devices", 00:07:39.909 "framework_get_config", 00:07:39.909 "framework_get_subsystems", 00:07:39.909 "trace_get_info", 00:07:39.909 "trace_get_tpoint_group_mask", 00:07:39.909 "trace_disable_tpoint_group", 00:07:39.909 "trace_enable_tpoint_group", 00:07:39.909 "trace_clear_tpoint_mask", 00:07:39.909 "trace_set_tpoint_mask", 00:07:39.909 "keyring_get_keys", 00:07:39.909 "spdk_get_version", 00:07:39.909 "rpc_get_methods" 00:07:39.909 ] 00:07:39.909 13:16:20 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:07:39.909 13:16:20 spdkcli_tcp -- common/autotest_common.sh@730 -- # xtrace_disable 00:07:39.909 13:16:20 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:39.909 13:16:20 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:07:39.909 13:16:20 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 831249 00:07:39.909 13:16:20 spdkcli_tcp -- 
common/autotest_common.sh@950 -- # '[' -z 831249 ']' 00:07:39.909 13:16:20 spdkcli_tcp -- common/autotest_common.sh@954 -- # kill -0 831249 00:07:40.169 13:16:20 spdkcli_tcp -- common/autotest_common.sh@955 -- # uname 00:07:40.169 13:16:20 spdkcli_tcp -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:40.169 13:16:20 spdkcli_tcp -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 831249 00:07:40.169 13:16:20 spdkcli_tcp -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:40.169 13:16:20 spdkcli_tcp -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:40.169 13:16:20 spdkcli_tcp -- common/autotest_common.sh@968 -- # echo 'killing process with pid 831249' 00:07:40.169 killing process with pid 831249 00:07:40.169 13:16:20 spdkcli_tcp -- common/autotest_common.sh@969 -- # kill 831249 00:07:40.169 13:16:20 spdkcli_tcp -- common/autotest_common.sh@974 -- # wait 831249 00:07:40.430 00:07:40.430 real 0m1.535s 00:07:40.430 user 0m2.909s 00:07:40.430 sys 0m0.445s 00:07:40.430 13:16:20 spdkcli_tcp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:40.430 13:16:20 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:07:40.430 ************************************ 00:07:40.430 END TEST spdkcli_tcp 00:07:40.430 ************************************ 00:07:40.430 13:16:21 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:40.430 13:16:21 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:40.430 13:16:21 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:40.430 13:16:21 -- common/autotest_common.sh@10 -- # set +x 00:07:40.430 ************************************ 00:07:40.430 START TEST dpdk_mem_utility 00:07:40.430 ************************************ 00:07:40.430 13:16:21 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:07:40.430 * Looking for test storage... 00:07:40.430 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:07:40.430 13:16:21 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:40.430 13:16:21 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=831623 00:07:40.430 13:16:21 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 831623 00:07:40.430 13:16:21 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:07:40.430 13:16:21 dpdk_mem_utility -- common/autotest_common.sh@831 -- # '[' -z 831623 ']' 00:07:40.430 13:16:21 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:40.430 13:16:21 dpdk_mem_utility -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:40.430 13:16:21 dpdk_mem_utility -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:40.430 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:40.430 13:16:21 dpdk_mem_utility -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:40.430 13:16:21 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:40.430 [2024-07-25 13:16:21.212858] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:07:40.430 [2024-07-25 13:16:21.212919] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid831623 ] 00:07:40.690 [2024-07-25 13:16:21.306530] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:40.690 [2024-07-25 13:16:21.374018] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:41.259 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:41.259 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@864 -- # return 0 00:07:41.259 13:16:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:07:41.259 13:16:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:07:41.259 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:41.259 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:41.259 { 00:07:41.259 "filename": "/tmp/spdk_mem_dump.txt" 00:07:41.259 } 00:07:41.259 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:41.259 13:16:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:07:41.523 DPDK memory size 816.000000 MiB in 2 heap(s) 00:07:41.523 2 heaps totaling size 816.000000 MiB 00:07:41.523 size: 814.000000 MiB heap id: 0 00:07:41.523 size: 2.000000 MiB heap id: 1 00:07:41.523 end heaps---------- 00:07:41.523 8 mempools totaling size 598.116089 MiB 00:07:41.523 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:07:41.523 size: 158.602051 MiB name: PDU_data_out_Pool 00:07:41.523 size: 84.521057 MiB name: bdev_io_831623 00:07:41.523 size: 51.011292 MiB name: evtpool_831623 00:07:41.523 size: 50.003479 
MiB name: msgpool_831623 00:07:41.523 size: 21.763794 MiB name: PDU_Pool 00:07:41.523 size: 19.513306 MiB name: SCSI_TASK_Pool 00:07:41.523 size: 0.026123 MiB name: Session_Pool 00:07:41.523 end mempools------- 00:07:41.523 201 memzones totaling size 4.176453 MiB 00:07:41.523 size: 1.000366 MiB name: RG_ring_0_831623 00:07:41.523 size: 1.000366 MiB name: RG_ring_1_831623 00:07:41.523 size: 1.000366 MiB name: RG_ring_4_831623 00:07:41.523 size: 1.000366 MiB name: RG_ring_5_831623 00:07:41.523 size: 0.125366 MiB name: RG_ring_2_831623 00:07:41.523 size: 0.015991 MiB name: RG_ring_3_831623 00:07:41.523 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:41.523 size: 0.000305 MiB name: 0000:cc:01.0_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:01.1_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:01.2_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:01.3_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:01.4_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:01.5_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:01.6_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:01.7_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:02.0_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:02.1_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:02.2_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:02.3_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:02.4_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:02.5_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:02.6_qat 00:07:41.523 size: 0.000305 MiB name: 0000:cc:02.7_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:01.0_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:01.1_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:01.2_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:01.3_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:01.4_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:01.5_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:01.6_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:01.7_qat 00:07:41.523 size: 0.000305 
MiB name: 0000:ce:02.0_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:02.1_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:02.2_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:02.3_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:02.4_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:02.5_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:02.6_qat 00:07:41.523 size: 0.000305 MiB name: 0000:ce:02.7_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:01.0_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:01.1_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:01.2_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:01.3_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:01.4_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:01.5_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:01.6_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:01.7_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:02.0_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:02.1_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:02.2_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:02.3_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:02.4_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:02.5_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:02.6_qat 00:07:41.523 size: 0.000305 MiB name: 0000:d0:02.7_qat 00:07:41.523 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_0 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_1 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_0 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_2 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_3 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_1 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_4 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_5 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_2 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_6 00:07:41.523 size: 0.000122 MiB name: 
rte_cryptodev_data_7 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_3 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_8 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_9 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_4 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_10 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_11 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_5 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_12 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_13 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_6 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_14 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_15 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_7 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_16 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_17 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_8 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_18 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_19 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_9 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_20 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_21 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_10 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_22 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_23 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_11 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_24 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_25 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_12 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_26 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_27 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_13 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_28 00:07:41.523 size: 0.000122 MiB name: 
rte_cryptodev_data_29 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_14 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_30 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_31 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_15 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_32 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_33 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_16 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_34 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_35 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_17 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_36 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_37 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_18 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_38 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_39 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_19 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_40 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_41 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_20 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_42 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_43 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_21 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_44 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_45 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_22 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_46 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_47 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_23 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_48 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_49 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_24 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_50 00:07:41.523 size: 0.000122 MiB 
name: rte_cryptodev_data_51 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_25 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_52 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_53 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_26 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_54 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_55 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_27 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_56 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_57 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_28 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_58 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_59 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_29 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_60 00:07:41.523 size: 0.000122 MiB name: rte_cryptodev_data_61 00:07:41.523 size: 0.000122 MiB name: rte_compressdev_data_30 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_62 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_63 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_31 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_64 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_65 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_32 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_66 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_67 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_33 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_68 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_69 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_34 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_70 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_71 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_35 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_72 00:07:41.524 size: 0.000122 
MiB name: rte_cryptodev_data_73 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_36 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_74 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_75 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_37 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_76 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_77 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_38 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_78 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_79 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_39 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_80 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_81 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_40 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_82 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_83 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_41 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_84 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_85 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_42 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_86 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_87 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_43 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_88 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_89 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_44 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_90 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_91 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_45 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_92 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_93 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_46 00:07:41.524 size: 0.000122 MiB name: rte_cryptodev_data_94 00:07:41.524 size: 
0.000122 MiB name: rte_cryptodev_data_95 00:07:41.524 size: 0.000122 MiB name: rte_compressdev_data_47 00:07:41.524 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:41.524 end memzones------- 00:07:41.524 13:16:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:07:41.524 heap id: 0 total size: 814.000000 MiB number of busy elements: 493 number of free elements: 14 00:07:41.524 list of free elements. size: 11.842896 MiB 00:07:41.524 element at address: 0x200000400000 with size: 1.999512 MiB 00:07:41.524 element at address: 0x200018e00000 with size: 0.999878 MiB 00:07:41.524 element at address: 0x200019000000 with size: 0.999878 MiB 00:07:41.524 element at address: 0x200003e00000 with size: 0.996460 MiB 00:07:41.524 element at address: 0x200031c00000 with size: 0.994446 MiB 00:07:41.524 element at address: 0x200007000000 with size: 0.991760 MiB 00:07:41.524 element at address: 0x200013800000 with size: 0.978882 MiB 00:07:41.524 element at address: 0x200019200000 with size: 0.937256 MiB 00:07:41.524 element at address: 0x20001aa00000 with size: 0.583252 MiB 00:07:41.524 element at address: 0x200003a00000 with size: 0.498535 MiB 00:07:41.524 element at address: 0x20000b200000 with size: 0.491272 MiB 00:07:41.524 element at address: 0x200000800000 with size: 0.486145 MiB 00:07:41.524 element at address: 0x200019400000 with size: 0.485840 MiB 00:07:41.524 element at address: 0x200027e00000 with size: 0.399780 MiB 00:07:41.524 list of standard malloc elements. 
size: 199.872253 MiB 00:07:41.524 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:07:41.524 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:07:41.524 element at address: 0x200018efff80 with size: 1.000122 MiB 00:07:41.524 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:07:41.524 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:07:41.524 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:07:41.524 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:07:41.524 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:07:41.524 element at address: 0x20000033b340 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000033e8c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000341e40 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003453c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000348940 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000034bec0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000034f440 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003529c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000355f40 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003594c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000035ca40 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000035ffc0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000363540 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000366ac0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000036a040 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000036d5c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000370b40 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003740c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000377640 with size: 0.004395 MiB 00:07:41.524 element at 
address: 0x20000037abc0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000037e140 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003816c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000384c40 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003881c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000038b740 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000038ecc0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000392240 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003957c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000398d40 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000039c2c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x20000039f840 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003a2dc0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003a6340 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003a98c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003ace40 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003b03c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003b3940 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003b6ec0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003ba440 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003bd9c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003c0f40 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003c44c0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003c7a40 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003cafc0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003ce540 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003d1ac0 with size: 0.004395 MiB 00:07:41.524 element at address: 0x2000003d5040 with size: 0.004395 MiB 
00:07:41.524 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:07:41.524 element at address: 0x200000339240 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000033a2c0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000033c7c0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000033d840 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000033fd40 with size: 0.004028 MiB 00:07:41.524 element at address: 0x200000340dc0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x2000003432c0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x200000344340 with size: 0.004028 MiB 00:07:41.524 element at address: 0x200000346840 with size: 0.004028 MiB 00:07:41.524 element at address: 0x2000003478c0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x200000349dc0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000034ae40 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000034d340 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000034e3c0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x2000003508c0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x200000351940 with size: 0.004028 MiB 00:07:41.524 element at address: 0x200000353e40 with size: 0.004028 MiB 00:07:41.524 element at address: 0x200000354ec0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x2000003573c0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x200000358440 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000035a940 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000035b9c0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000035dec0 with size: 0.004028 MiB 00:07:41.524 element at address: 0x20000035ef40 with size: 0.004028 MiB 00:07:41.524 element at address: 0x200000361440 with size: 0.004028 MiB 00:07:41.524 element at address: 0x2000003624c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003649c0 with 
size: 0.004028 MiB 00:07:41.525 element at address: 0x200000365a40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000367f40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000368fc0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000036b4c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000036c540 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000036ea40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000036fac0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000371fc0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000373040 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000375540 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003765c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000378ac0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000379b40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000037c040 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000037d0c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000037f5c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000380640 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000382b40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000383bc0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003860c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000387140 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000389640 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000038a6c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000038cbc0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000038dc40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000390140 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003911c0 with size: 0.004028 MiB 00:07:41.525 element at address: 
0x2000003936c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000394740 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000396c40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000397cc0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000039a1c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000039b240 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000039d740 with size: 0.004028 MiB 00:07:41.525 element at address: 0x20000039e7c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003a0cc0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003a1d40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003a4240 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003a52c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003a77c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003a8840 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003aad40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003abdc0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003ae2c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003af340 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003b1840 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003b28c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003b4dc0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003b5e40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003b8340 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003b93c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003bb8c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003bc940 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003bee40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003bfec0 with size: 0.004028 MiB 00:07:41.525 
element at address: 0x2000003c23c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003c3440 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003c5940 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003c69c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003c8ec0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003c9f40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003cc440 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003cd4c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003cf9c0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003d0a40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003d2f40 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003d3fc0 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:07:41.525 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:07:41.525 element at address: 0x200000200000 with size: 0.000305 MiB 00:07:41.525 element at address: 0x20000020e940 with size: 0.000305 MiB 00:07:41.525 element at address: 0x200000200140 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200200 with size: 0.000183 MiB 00:07:41.525 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200380 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200440 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200500 with size: 0.000183 MiB 00:07:41.525 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200680 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200740 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200800 with size: 0.000183 MiB 00:07:41.525 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200980 with size: 0.000183 
MiB 00:07:41.525 element at address: 0x200000200a40 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200b00 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200c80 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200d40 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200e00 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000200ec0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209180 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209240 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209300 with size: 0.000183 MiB 00:07:41.525 element at address: 0x2000002093c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209480 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209540 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209600 with size: 0.000183 MiB 00:07:41.525 element at address: 0x2000002096c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209780 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209840 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209900 with size: 0.000183 MiB 00:07:41.525 element at address: 0x2000002099c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209a80 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209b40 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209c00 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209cc0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209d80 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209e40 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209f00 with size: 0.000183 MiB 00:07:41.525 element at address: 0x200000209fc0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a080 
with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a140 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a200 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a2c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a380 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a440 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a500 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a5c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a680 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a740 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a800 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a8c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020a980 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020aa40 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020ab00 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020abc0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020ac80 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020ad40 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020ae00 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020aec0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020af80 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b040 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b100 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b1c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b280 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b340 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b400 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b4c0 with size: 0.000183 MiB 00:07:41.525 element at 
address: 0x20000020b580 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b640 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b700 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b7c0 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b880 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020b940 with size: 0.000183 MiB 00:07:41.525 element at address: 0x20000020ba00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020bac0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020bb80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020bc40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020bd00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020bdc0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020be80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020bf40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c000 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c0c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c180 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c240 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c300 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c3c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c480 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c540 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c600 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c6c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c780 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c840 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c900 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020c9c0 with size: 0.000183 MiB 
00:07:41.526 element at address: 0x20000020ca80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020cb40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020cc00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020ccc0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020cd80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020ce40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020cf00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020cfc0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d080 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d140 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d200 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d2c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d380 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d440 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d500 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d5c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d680 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d740 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d800 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d8c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020d980 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020da40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020db00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020dbc0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020dc80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020dd40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020de00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020dec0 with 
size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020df80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e040 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e100 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e1c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e280 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e340 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e400 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e4c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e580 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e640 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e700 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e7c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020e880 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020ea80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020eb40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020ec00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020ecc0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020ed80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020ee40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020ef00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020efc0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f080 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f140 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f200 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f2c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f380 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f440 with size: 0.000183 MiB 00:07:41.526 element at address: 
0x20000020f500 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f5c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f680 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f740 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f800 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f8c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020f980 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020fa40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020fb00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020fbc0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020fc80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020fd40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020fe00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020fec0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000020ff80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210040 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210100 with size: 0.000183 MiB 00:07:41.526 element at address: 0x2000002101c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210280 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210340 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210400 with size: 0.000183 MiB 00:07:41.526 element at address: 0x2000002104c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210580 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210640 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210700 with size: 0.000183 MiB 00:07:41.526 element at address: 0x2000002107c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210880 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210940 with size: 0.000183 MiB 00:07:41.526 
element at address: 0x200000210a00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000210c00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000214ec0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235180 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235240 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235300 with size: 0.000183 MiB 00:07:41.526 element at address: 0x2000002353c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235480 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235540 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235600 with size: 0.000183 MiB 00:07:41.526 element at address: 0x2000002356c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235780 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235840 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235900 with size: 0.000183 MiB 00:07:41.526 element at address: 0x2000002359c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235a80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235b40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235c00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235cc0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235d80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235e40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000235f00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236100 with size: 0.000183 MiB 00:07:41.526 element at address: 0x2000002361c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236280 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236340 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236400 with size: 0.000183 MiB 00:07:41.526 element at address: 0x2000002364c0 with size: 0.000183 
MiB 00:07:41.526 element at address: 0x200000236580 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236640 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236700 with size: 0.000183 MiB 00:07:41.526 element at address: 0x2000002367c0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236880 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236940 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236a00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236ac0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236b80 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236c40 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000236d00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000338f00 with size: 0.000183 MiB 00:07:41.526 element at address: 0x200000338fc0 with size: 0.000183 MiB 00:07:41.526 element at address: 0x20000033c540 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000033fac0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000343040 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003465c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000349b40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000034d0c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000350640 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000353bc0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000357140 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000035a6c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000035dc40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003611c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000364740 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000367cc0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000036b240 
with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000036e7c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000371d40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003752c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000378840 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000037bdc0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000037f340 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003828c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000385e40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003893c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000038c940 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000038fec0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000393440 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003969c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200000399f40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000039d4c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003a0a40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003a3fc0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003a7540 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003aaac0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003ae040 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003b15c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003b4b40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003b80c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003bb640 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003bebc0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003c2140 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003c56c0 with size: 0.000183 MiB 00:07:41.527 element at 
address: 0x2000003c8c40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003cc1c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003cf740 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003d2cc0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000003d6840 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087c740 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087c800 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087c8c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087c980 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087ca40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087cb00 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087cbc0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087cc80 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087cd40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087ce00 with size: 0.000183 MiB 00:07:41.527 element at address: 0x20000087cec0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x2000008fd180 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e66580 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e66640 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6d240 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6d440 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6d500 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6d5c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6d680 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6d740 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6d800 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6d8c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6d980 with size: 0.000183 MiB 
00:07:41.527 element at address: 0x200027e6da40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6db00 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6dbc0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6dc80 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6dd40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6de00 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6dec0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6df80 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e040 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e100 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e1c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e280 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e340 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e400 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e4c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e580 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e640 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e700 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e7c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e880 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6e940 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6ea00 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6eac0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6eb80 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6ec40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6ed00 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6edc0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6ee80 with 
size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6ef40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f000 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f0c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f180 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f240 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f300 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f3c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f480 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f540 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f600 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f6c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f780 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f840 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f900 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6f9c0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6fa80 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6fb40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6fc00 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6fcc0 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6fd80 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6fe40 with size: 0.000183 MiB 00:07:41.527 element at address: 0x200027e6ff00 with size: 0.000183 MiB 00:07:41.527 list of memzone associated elements. 
size: 602.284851 MiB 00:07:41.527 element at address: 0x20001aa95500 with size: 211.416748 MiB 00:07:41.527 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0 00:07:41.527 element at address: 0x200027e6ffc0 with size: 157.562561 MiB 00:07:41.527 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0 00:07:41.527 element at address: 0x2000139fab80 with size: 84.020630 MiB 00:07:41.527 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_831623_0 00:07:41.527 element at address: 0x2000009ff380 with size: 48.003052 MiB 00:07:41.527 associated memzone info: size: 48.002930 MiB name: MP_evtpool_831623_0 00:07:41.527 element at address: 0x200003fff380 with size: 48.003052 MiB 00:07:41.527 associated memzone info: size: 48.002930 MiB name: MP_msgpool_831623_0 00:07:41.527 element at address: 0x2000195be940 with size: 20.255554 MiB 00:07:41.528 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0 00:07:41.528 element at address: 0x200031dfeb40 with size: 18.005066 MiB 00:07:41.528 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0 00:07:41.528 element at address: 0x2000005ffe00 with size: 2.000488 MiB 00:07:41.528 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_831623 00:07:41.528 element at address: 0x200003bffe00 with size: 2.000488 MiB 00:07:41.528 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_831623 00:07:41.528 element at address: 0x200000236dc0 with size: 1.008118 MiB 00:07:41.528 associated memzone info: size: 1.007996 MiB name: MP_evtpool_831623 00:07:41.528 element at address: 0x20000b2fde40 with size: 1.008118 MiB 00:07:41.528 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool 00:07:41.528 element at address: 0x2000194bc800 with size: 1.008118 MiB 00:07:41.528 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool 00:07:41.528 element at address: 0x2000070fde40 with size: 1.008118 MiB 00:07:41.528 associated 
memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool 00:07:41.528 element at address: 0x2000008fd240 with size: 1.008118 MiB 00:07:41.528 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool 00:07:41.528 element at address: 0x200003eff180 with size: 1.000488 MiB 00:07:41.528 associated memzone info: size: 1.000366 MiB name: RG_ring_0_831623 00:07:41.528 element at address: 0x200003affc00 with size: 1.000488 MiB 00:07:41.528 associated memzone info: size: 1.000366 MiB name: RG_ring_1_831623 00:07:41.528 element at address: 0x2000138fa980 with size: 1.000488 MiB 00:07:41.528 associated memzone info: size: 1.000366 MiB name: RG_ring_4_831623 00:07:41.528 element at address: 0x200031cfe940 with size: 1.000488 MiB 00:07:41.528 associated memzone info: size: 1.000366 MiB name: RG_ring_5_831623 00:07:41.528 element at address: 0x200003a7fa00 with size: 0.500488 MiB 00:07:41.528 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_831623 00:07:41.528 element at address: 0x20000b27dc40 with size: 0.500488 MiB 00:07:41.528 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool 00:07:41.528 element at address: 0x20000087cf80 with size: 0.500488 MiB 00:07:41.528 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool 00:07:41.528 element at address: 0x20001947c600 with size: 0.250488 MiB 00:07:41.528 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool 00:07:41.528 element at address: 0x200000214f80 with size: 0.125488 MiB 00:07:41.528 associated memzone info: size: 0.125366 MiB name: RG_ring_2_831623 00:07:41.528 element at address: 0x200000200f80 with size: 0.031738 MiB 00:07:41.528 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool 00:07:41.528 element at address: 0x200027e66700 with size: 0.023743 MiB 00:07:41.528 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0 00:07:41.528 element at address: 0x200000210cc0 with size: 0.016113 MiB 00:07:41.528 
associated memzone info: size: 0.015991 MiB name: RG_ring_3_831623 00:07:41.528 element at address: 0x200027e6c840 with size: 0.002441 MiB 00:07:41.528 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool 00:07:41.528 element at address: 0x2000003d6300 with size: 0.001282 MiB 00:07:41.528 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:07:41.528 element at address: 0x2000003d6a40 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.0_qat 00:07:41.528 element at address: 0x2000003d2d80 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.1_qat 00:07:41.528 element at address: 0x2000003cf800 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.2_qat 00:07:41.528 element at address: 0x2000003cc280 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.3_qat 00:07:41.528 element at address: 0x2000003c8d00 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.4_qat 00:07:41.528 element at address: 0x2000003c5780 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.5_qat 00:07:41.528 element at address: 0x2000003c2200 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.6_qat 00:07:41.528 element at address: 0x2000003bec80 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:01.7_qat 00:07:41.528 element at address: 0x2000003bb700 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.0_qat 00:07:41.528 element at address: 0x2000003b8180 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.1_qat 00:07:41.528 element at address: 0x2000003b4c00 with size: 0.000427 MiB 00:07:41.528 associated memzone 
info: size: 0.000305 MiB name: 0000:cc:02.2_qat 00:07:41.528 element at address: 0x2000003b1680 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.3_qat 00:07:41.528 element at address: 0x2000003ae100 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.4_qat 00:07:41.528 element at address: 0x2000003aab80 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.5_qat 00:07:41.528 element at address: 0x2000003a7600 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.6_qat 00:07:41.528 element at address: 0x2000003a4080 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:cc:02.7_qat 00:07:41.528 element at address: 0x2000003a0b00 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.0_qat 00:07:41.528 element at address: 0x20000039d580 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.1_qat 00:07:41.528 element at address: 0x20000039a000 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.2_qat 00:07:41.528 element at address: 0x200000396a80 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.3_qat 00:07:41.528 element at address: 0x200000393500 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.4_qat 00:07:41.528 element at address: 0x20000038ff80 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.5_qat 00:07:41.528 element at address: 0x20000038ca00 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:01.6_qat 00:07:41.528 element at address: 0x200000389480 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 
MiB name: 0000:ce:01.7_qat 00:07:41.528 element at address: 0x200000385f00 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.0_qat 00:07:41.528 element at address: 0x200000382980 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.1_qat 00:07:41.528 element at address: 0x20000037f400 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.2_qat 00:07:41.528 element at address: 0x20000037be80 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.3_qat 00:07:41.528 element at address: 0x200000378900 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.4_qat 00:07:41.528 element at address: 0x200000375380 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.5_qat 00:07:41.528 element at address: 0x200000371e00 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.6_qat 00:07:41.528 element at address: 0x20000036e880 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:ce:02.7_qat 00:07:41.528 element at address: 0x20000036b300 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.0_qat 00:07:41.528 element at address: 0x200000367d80 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.1_qat 00:07:41.528 element at address: 0x200000364800 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.2_qat 00:07:41.528 element at address: 0x200000361280 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.3_qat 00:07:41.528 element at address: 0x20000035dd00 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 
0000:d0:01.4_qat 00:07:41.528 element at address: 0x20000035a780 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.5_qat 00:07:41.528 element at address: 0x200000357200 with size: 0.000427 MiB 00:07:41.528 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.6_qat 00:07:41.528 element at address: 0x200000353c80 with size: 0.000427 MiB 00:07:41.529 associated memzone info: size: 0.000305 MiB name: 0000:d0:01.7_qat 00:07:41.529 element at address: 0x200000350700 with size: 0.000427 MiB 00:07:41.529 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.0_qat 00:07:41.529 element at address: 0x20000034d180 with size: 0.000427 MiB 00:07:41.529 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.1_qat 00:07:41.529 element at address: 0x200000349c00 with size: 0.000427 MiB 00:07:41.529 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.2_qat 00:07:41.529 element at address: 0x200000346680 with size: 0.000427 MiB 00:07:41.529 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.3_qat 00:07:41.529 element at address: 0x200000343100 with size: 0.000427 MiB 00:07:41.529 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.4_qat 00:07:41.529 element at address: 0x20000033fb80 with size: 0.000427 MiB 00:07:41.529 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.5_qat 00:07:41.529 element at address: 0x20000033c600 with size: 0.000427 MiB 00:07:41.529 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.6_qat 00:07:41.529 element at address: 0x200000339080 with size: 0.000427 MiB 00:07:41.529 associated memzone info: size: 0.000305 MiB name: 0000:d0:02.7_qat 00:07:41.529 element at address: 0x2000003d6900 with size: 0.000305 MiB 00:07:41.529 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:07:41.529 element at address: 0x200000235fc0 with size: 0.000305 MiB 00:07:41.529 associated memzone info: size: 0.000183 MiB name: MP_msgpool_831623 
00:07:41.529 element at address: 0x200000210ac0 with size: 0.000305 MiB 00:07:41.529 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_831623 00:07:41.529 element at address: 0x200027e6d300 with size: 0.000305 MiB 00:07:41.529 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:07:41.529 element at address: 0x2000003d6240 with size: 0.000183 MiB 00:07:41.529 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:07:41.529 13:16:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:07:41.529 13:16:22 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # killprocess 831623 00:07:41.529 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@950 -- # '[' -z 831623 ']' 00:07:41.529 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@954 -- # kill -0 831623 00:07:41.529 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@955 -- # uname 00:07:41.529 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:41.529 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 831623 00:07:41.529 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:07:41.529 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:07:41.529 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@968 -- # echo 'killing process with pid 831623' 00:07:41.529 killing process with pid 831623 00:07:41.529 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@969 -- # kill 831623 00:07:41.529 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@974 -- # wait 831623 00:07:41.789 00:07:41.789 real 0m1.418s 00:07:41.789 user 0m1.570s 00:07:41.789 sys 0m0.420s 00:07:41.789 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:41.789 13:16:22 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:07:41.789 
************************************ 00:07:41.789 END TEST dpdk_mem_utility 00:07:41.789 ************************************ 00:07:41.789 13:16:22 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:41.789 13:16:22 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:41.789 13:16:22 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:41.789 13:16:22 -- common/autotest_common.sh@10 -- # set +x 00:07:41.789 ************************************ 00:07:41.789 START TEST event 00:07:41.789 ************************************ 00:07:41.789 13:16:22 event -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh 00:07:42.049 * Looking for test storage... 00:07:42.049 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event 00:07:42.049 13:16:22 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:42.049 13:16:22 event -- bdev/nbd_common.sh@6 -- # set -e 00:07:42.049 13:16:22 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:42.049 13:16:22 event -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:07:42.049 13:16:22 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:42.049 13:16:22 event -- common/autotest_common.sh@10 -- # set +x 00:07:42.049 ************************************ 00:07:42.049 START TEST event_perf 00:07:42.049 ************************************ 00:07:42.049 13:16:22 event.event_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1 00:07:42.049 Running I/O for 1 seconds...[2024-07-25 13:16:22.688238] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:07:42.049 [2024-07-25 13:16:22.688322] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid831850 ] 00:07:42.049 [2024-07-25 13:16:22.783877] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:42.309 [2024-07-25 13:16:22.865318] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:42.309 [2024-07-25 13:16:22.865462] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:42.309 [2024-07-25 13:16:22.865600] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:42.309 [2024-07-25 13:16:22.865611] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:43.248 Running I/O for 1 seconds... 00:07:43.248 lcore 0: 80257 00:07:43.248 lcore 1: 80260 00:07:43.248 lcore 2: 80263 00:07:43.248 lcore 3: 80262 00:07:43.248 done. 00:07:43.248 00:07:43.248 real 0m1.256s 00:07:43.248 user 0m4.143s 00:07:43.248 sys 0m0.107s 00:07:43.248 13:16:23 event.event_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:43.248 13:16:23 event.event_perf -- common/autotest_common.sh@10 -- # set +x 00:07:43.248 ************************************ 00:07:43.248 END TEST event_perf 00:07:43.248 ************************************ 00:07:43.248 13:16:23 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:43.248 13:16:23 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:07:43.248 13:16:23 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:43.248 13:16:23 event -- common/autotest_common.sh@10 -- # set +x 00:07:43.248 ************************************ 00:07:43.248 START TEST event_reactor 00:07:43.248 ************************************ 00:07:43.248 13:16:23 event.event_reactor -- 
common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1 00:07:43.248 [2024-07-25 13:16:24.023122] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:07:43.248 [2024-07-25 13:16:24.023188] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid832034 ] 00:07:43.508 [2024-07-25 13:16:24.115610] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:43.508 [2024-07-25 13:16:24.190714] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:44.448 test_start 00:07:44.448 oneshot 00:07:44.448 tick 100 00:07:44.448 tick 100 00:07:44.448 tick 250 00:07:44.448 tick 100 00:07:44.448 tick 100 00:07:44.448 tick 100 00:07:44.448 tick 250 00:07:44.448 tick 500 00:07:44.448 tick 100 00:07:44.448 tick 100 00:07:44.448 tick 250 00:07:44.448 tick 100 00:07:44.448 tick 100 00:07:44.448 test_end 00:07:44.707 00:07:44.707 real 0m1.245s 00:07:44.707 user 0m1.137s 00:07:44.707 sys 0m0.102s 00:07:44.707 13:16:25 event.event_reactor -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:44.707 13:16:25 event.event_reactor -- common/autotest_common.sh@10 -- # set +x 00:07:44.707 ************************************ 00:07:44.707 END TEST event_reactor 00:07:44.708 ************************************ 00:07:44.708 13:16:25 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:44.708 13:16:25 event -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:07:44.708 13:16:25 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:44.708 13:16:25 event -- common/autotest_common.sh@10 -- # set +x 00:07:44.708 ************************************ 00:07:44.708 START TEST 
event_reactor_perf 00:07:44.708 ************************************ 00:07:44.708 13:16:25 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:07:44.708 [2024-07-25 13:16:25.345429] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:07:44.708 [2024-07-25 13:16:25.345496] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid832346 ] 00:07:44.708 [2024-07-25 13:16:25.435166] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:44.967 [2024-07-25 13:16:25.505230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:45.905 test_start 00:07:45.905 test_end 00:07:45.905 Performance: 395284 events per second 00:07:45.905 00:07:45.905 real 0m1.237s 00:07:45.905 user 0m1.137s 00:07:45.905 sys 0m0.096s 00:07:45.905 13:16:26 event.event_reactor_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:45.905 13:16:26 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:07:45.905 ************************************ 00:07:45.905 END TEST event_reactor_perf 00:07:45.905 ************************************ 00:07:45.905 13:16:26 event -- event/event.sh@49 -- # uname -s 00:07:45.905 13:16:26 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:07:45.905 13:16:26 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:45.905 13:16:26 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:45.905 13:16:26 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:45.905 13:16:26 event -- common/autotest_common.sh@10 -- # set +x 00:07:45.905 ************************************ 00:07:45.905 
START TEST event_scheduler 00:07:45.905 ************************************ 00:07:45.905 13:16:26 event.event_scheduler -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:07:46.166 * Looking for test storage... 00:07:46.166 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:07:46.166 13:16:26 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:07:46.166 13:16:26 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=832688 00:07:46.166 13:16:26 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:07:46.166 13:16:26 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:07:46.166 13:16:26 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 832688 00:07:46.166 13:16:26 event.event_scheduler -- common/autotest_common.sh@831 -- # '[' -z 832688 ']' 00:07:46.166 13:16:26 event.event_scheduler -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:46.166 13:16:26 event.event_scheduler -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:46.166 13:16:26 event.event_scheduler -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:46.166 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:46.166 13:16:26 event.event_scheduler -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:46.166 13:16:26 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:46.166 [2024-07-25 13:16:26.792953] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:07:46.166 [2024-07-25 13:16:26.793009] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid832688 ] 00:07:46.166 [2024-07-25 13:16:26.939772] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:46.426 [2024-07-25 13:16:27.108090] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:46.426 [2024-07-25 13:16:27.108254] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:46.426 [2024-07-25 13:16:27.108519] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:07:46.426 [2024-07-25 13:16:27.108626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:07:46.996 13:16:27 event.event_scheduler -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:46.996 13:16:27 event.event_scheduler -- common/autotest_common.sh@864 -- # return 0 00:07:46.996 13:16:27 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:07:46.996 13:16:27 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:46.996 13:16:27 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:46.996 [2024-07-25 13:16:27.651574] dpdk_governor.c: 173:_init: *ERROR*: App core mask contains some but not all of a set of SMT siblings 00:07:46.996 [2024-07-25 13:16:27.651627] scheduler_dynamic.c: 270:init: *NOTICE*: Unable to initialize dpdk governor 00:07:46.996 [2024-07-25 13:16:27.651653] scheduler_dynamic.c: 416:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:07:46.996 [2024-07-25 13:16:27.651670] scheduler_dynamic.c: 418:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:07:46.997 [2024-07-25 13:16:27.651686] scheduler_dynamic.c: 420:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:07:46.997 13:16:27 
event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:46.997 13:16:27 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:07:46.997 13:16:27 event.event_scheduler -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:46.997 13:16:27 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:46.997 [2024-07-25 13:16:27.767046] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 00:07:46.997 13:16:27 event.event_scheduler -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:46.997 13:16:27 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:07:46.997 13:16:27 event.event_scheduler -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:46.997 13:16:27 event.event_scheduler -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:46.997 13:16:27 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:47.257 ************************************ 00:07:47.257 START TEST scheduler_create_thread 00:07:47.257 ************************************ 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # scheduler_create_thread 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:47.257 2 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
active_pinned -m 0x2 -a 100 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:47.257 3 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.257 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:47.258 4 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:47.258 5 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:47.258 6 00:07:47.258 13:16:27 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:47.258 7 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:47.258 8 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:47.258 9 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:07:47.258 13:16:27 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:47.258 10 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n half_active -a 0 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:47.258 13:16:27 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:48.637 13:16:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:48.637 13:16:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:07:48.637 13:16:29 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:07:48.637 13:16:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:48.637 13:16:29 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:49.575 13:16:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:49.575 13:16:30 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:07:49.575 13:16:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:49.575 13:16:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:50.511 13:16:30 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:50.511 13:16:30 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:07:50.511 13:16:30 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:07:50.511 13:16:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@561 -- # xtrace_disable 00:07:50.511 13:16:30 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:51.079 13:16:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:07:51.079 00:07:51.079 real 0m3.896s 00:07:51.079 user 0m0.024s 00:07:51.079 sys 0m0.007s 00:07:51.079 13:16:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:51.079 13:16:31 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:07:51.079 ************************************ 00:07:51.079 END TEST scheduler_create_thread 00:07:51.079 ************************************ 00:07:51.079 13:16:31 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:07:51.079 13:16:31 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 832688 00:07:51.079 13:16:31 event.event_scheduler -- common/autotest_common.sh@950 -- # '[' -z 832688 ']' 00:07:51.079 13:16:31 event.event_scheduler -- common/autotest_common.sh@954 -- # kill -0 832688 00:07:51.079 13:16:31 event.event_scheduler -- common/autotest_common.sh@955 -- # uname 00:07:51.079 13:16:31 event.event_scheduler -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:07:51.079 13:16:31 event.event_scheduler -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 832688 00:07:51.079 13:16:31 event.event_scheduler -- common/autotest_common.sh@956 -- # process_name=reactor_2 00:07:51.079 13:16:31 
event.event_scheduler -- common/autotest_common.sh@960 -- # '[' reactor_2 = sudo ']' 00:07:51.079 13:16:31 event.event_scheduler -- common/autotest_common.sh@968 -- # echo 'killing process with pid 832688' 00:07:51.079 killing process with pid 832688 00:07:51.079 13:16:31 event.event_scheduler -- common/autotest_common.sh@969 -- # kill 832688 00:07:51.079 13:16:31 event.event_scheduler -- common/autotest_common.sh@974 -- # wait 832688 00:07:51.339 [2024-07-25 13:16:32.081008] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:07:51.907 00:07:51.907 real 0m5.789s 00:07:51.907 user 0m11.807s 00:07:51.907 sys 0m0.471s 00:07:51.907 13:16:32 event.event_scheduler -- common/autotest_common.sh@1126 -- # xtrace_disable 00:07:51.907 13:16:32 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:07:51.907 ************************************ 00:07:51.907 END TEST event_scheduler 00:07:51.907 ************************************ 00:07:51.907 13:16:32 event -- event/event.sh@51 -- # modprobe -n nbd 00:07:51.907 13:16:32 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:07:51.907 13:16:32 event -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:07:51.907 13:16:32 event -- common/autotest_common.sh@1107 -- # xtrace_disable 00:07:51.907 13:16:32 event -- common/autotest_common.sh@10 -- # set +x 00:07:51.907 ************************************ 00:07:51.907 START TEST app_repeat 00:07:51.907 ************************************ 00:07:51.907 13:16:32 event.app_repeat -- common/autotest_common.sh@1125 -- # app_repeat_test 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:51.907 
13:16:32 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@19 -- # repeat_pid=833660 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 833660' 00:07:51.907 Process app_repeat pid: 833660 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:07:51.907 spdk_app_start Round 0 00:07:51.907 13:16:32 event.app_repeat -- event/event.sh@25 -- # waitforlisten 833660 /var/tmp/spdk-nbd.sock 00:07:51.907 13:16:32 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 833660 ']' 00:07:51.907 13:16:32 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:51.907 13:16:32 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:07:51.907 13:16:32 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:51.907 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:51.907 13:16:32 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:07:51.907 13:16:32 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:07:51.907 [2024-07-25 13:16:32.548956] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:07:51.907 [2024-07-25 13:16:32.549003] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid833660 ] 00:07:51.907 [2024-07-25 13:16:32.634270] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:07:51.907 [2024-07-25 13:16:32.698153] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:07:51.907 [2024-07-25 13:16:32.698158] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:07:52.841 13:16:33 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:07:52.841 13:16:33 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:07:52.841 13:16:33 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:52.841 Malloc0 00:07:53.100 13:16:33 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:07:53.100 Malloc1 00:07:53.100 13:16:33 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:07:53.100 13:16:33 
event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:53.100 13:16:33 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:53.360 /dev/nbd0 00:07:53.360 13:16:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:53.360 13:16:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 
iflag=direct 00:07:53.360 1+0 records in 00:07:53.360 1+0 records out 00:07:53.360 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000203337 s, 20.1 MB/s 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:53.360 13:16:34 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:53.360 13:16:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:53.360 13:16:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:53.360 13:16:34 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:07:53.619 /dev/nbd1 00:07:53.619 13:16:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:53.619 13:16:34 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@869 -- # local i 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@873 -- # break 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 
)) 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:07:53.619 1+0 records in 00:07:53.619 1+0 records out 00:07:53.619 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000204349 s, 20.0 MB/s 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:07:53.619 13:16:34 event.app_repeat -- common/autotest_common.sh@889 -- # return 0 00:07:53.619 13:16:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:53.619 13:16:34 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:07:53.619 13:16:34 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:53.619 13:16:34 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:53.619 13:16:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:53.879 { 00:07:53.879 "nbd_device": "/dev/nbd0", 00:07:53.879 "bdev_name": "Malloc0" 00:07:53.879 }, 00:07:53.879 { 00:07:53.879 "nbd_device": "/dev/nbd1", 00:07:53.879 "bdev_name": "Malloc1" 00:07:53.879 } 00:07:53.879 ]' 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:53.879 { 00:07:53.879 "nbd_device": "/dev/nbd0", 00:07:53.879 "bdev_name": "Malloc0" 00:07:53.879 }, 00:07:53.879 { 00:07:53.879 "nbd_device": 
"/dev/nbd1", 00:07:53.879 "bdev_name": "Malloc1" 00:07:53.879 } 00:07:53.879 ]' 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:53.879 /dev/nbd1' 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:53.879 /dev/nbd1' 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:07:53.879 256+0 records in 00:07:53.879 256+0 records out 00:07:53.879 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125815 s, 83.3 MB/s 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:07:53.879 256+0 records in
00:07:53.879 256+0 records out
00:07:53.879 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0142181 s, 73.7 MB/s
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:07:53.879 256+0 records in
00:07:53.879 256+0 records out
00:07:53.879 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0158452 s, 66.2 MB/s
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:07:53.879 13:16:34 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:07:54.139 13:16:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:07:54.139 13:16:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:07:54.139 13:16:34 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:07:54.139 13:16:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:07:54.139 13:16:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:07:54.139 13:16:34 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:07:54.139 13:16:34 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:07:54.139 13:16:34 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:07:54.139 13:16:34 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:07:54.139 13:16:34 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:07:54.398 13:16:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:07:54.398 13:16:35 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:07:54.398 13:16:35 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:07:54.398 13:16:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:07:54.398 13:16:35 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:07:54.398 13:16:35 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:07:54.398 13:16:35 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:07:54.398 13:16:35 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:07:54.399 13:16:35 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:07:54.399 13:16:35 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:54.399 13:16:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]'
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo ''
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # true
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:07:54.658 13:16:35 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0
00:07:54.658 13:16:35 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:07:54.918 13:16:35 event.app_repeat -- event/event.sh@35 -- # sleep 3
00:07:54.918 [2024-07-25 13:16:35.621877] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:07:54.918 [2024-07-25 13:16:35.684235] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:07:54.918 [2024-07-25 13:16:35.684240] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:07:55.179 [2024-07-25 13:16:35.714884] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:07:55.179 [2024-07-25 13:16:35.714919] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:07:57.791 13:16:38 event.app_repeat -- event/event.sh@23 -- # for i in {0..2}
00:07:57.791 13:16:38 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1'
00:07:57.791 spdk_app_start Round 1
00:07:57.791 13:16:38 event.app_repeat -- event/event.sh@25 -- # waitforlisten 833660 /var/tmp/spdk-nbd.sock
00:07:57.791 13:16:38 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 833660 ']'
00:07:57.791 13:16:38 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:07:57.791 13:16:38 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100
00:07:57.791 13:16:38 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:07:57.791 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:07:57.791 13:16:38 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable
00:07:57.791 13:16:38 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:07:58.051 13:16:38 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:07:58.051 13:16:38 event.app_repeat -- common/autotest_common.sh@864 -- # return 0
00:07:58.051 13:16:38 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:07:58.311 Malloc0
00:07:58.311 13:16:38 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:07:58.572 Malloc1
00:07:58.572 13:16:39 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:07:58.572 13:16:39 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:07:58.573 /dev/nbd0
00:07:58.573 13:16:39 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:07:58.833 13:16:39 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:07:58.833 13:16:39 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0
00:07:58.833 13:16:39 event.app_repeat -- common/autotest_common.sh@869 -- # local i
00:07:58.833 13:16:39 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:07:58.833 13:16:39 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:07:58.833 13:16:39 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions
00:07:58.833 13:16:39 event.app_repeat -- common/autotest_common.sh@873 -- # break
00:07:58.833 13:16:39 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:07:58.833 13:16:39 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:07:58.833 13:16:39 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:07:58.833 1+0 records in
00:07:58.833 1+0 records out
00:07:58.833 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000217052 s, 18.9 MB/s
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@889 -- # return 0
00:07:58.834 13:16:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:07:58.834 13:16:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:58.834 13:16:39 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:07:58.834 /dev/nbd1
00:07:58.834 13:16:39 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:07:58.834 13:16:39 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@869 -- # local i
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@873 -- # break
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:07:58.834 1+0 records in
00:07:58.834 1+0 records out
00:07:58.834 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000227318 s, 18.0 MB/s
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:07:58.834 13:16:39 event.app_repeat -- common/autotest_common.sh@889 -- # return 0
00:07:58.834 13:16:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:07:58.834 13:16:39 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:07:58.834 13:16:39 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:07:58.834 13:16:39 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:58.834 13:16:39 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:07:59.094 {
00:07:59.094 "nbd_device": "/dev/nbd0",
00:07:59.094 "bdev_name": "Malloc0"
00:07:59.094 },
00:07:59.094 {
00:07:59.094 "nbd_device": "/dev/nbd1",
00:07:59.094 "bdev_name": "Malloc1"
00:07:59.094 }
00:07:59.094 ]'
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[
00:07:59.094 {
00:07:59.094 "nbd_device": "/dev/nbd0",
00:07:59.094 "bdev_name": "Malloc0"
00:07:59.094 },
00:07:59.094 {
00:07:59.094 "nbd_device": "/dev/nbd1",
00:07:59.094 "bdev_name": "Malloc1"
00:07:59.094 }
00:07:59.094 ]'
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:07:59.094 /dev/nbd1'
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:07:59.094 /dev/nbd1'
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:07:59.094 13:16:39 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:07:59.354 256+0 records in
00:07:59.354 256+0 records out
00:07:59.354 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0120821 s, 86.8 MB/s
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:07:59.354 256+0 records in
00:07:59.354 256+0 records out
00:07:59.354 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0146719 s, 71.5 MB/s
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:07:59.354 256+0 records in
00:07:59.354 256+0 records out
00:07:59.354 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.015172 s, 69.1 MB/s
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:07:59.354 13:16:39 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:07:59.614 13:16:40 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]'
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo ''
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # true
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:07:59.874 13:16:40 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0
00:07:59.874 13:16:40 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM
00:08:00.135 13:16:40 event.app_repeat -- event/event.sh@35 -- # sleep 3
00:08:00.395 [2024-07-25 13:16:40.996747] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2
00:08:00.395 [2024-07-25 13:16:41.060622] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:08:00.395 [2024-07-25 13:16:41.060626] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:00.395 [2024-07-25 13:16:41.092122] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered.
00:08:00.395 [2024-07-25 13:16:41.092156] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered.
00:08:03.692 13:16:43 event.app_repeat -- event/event.sh@23 -- # for i in {0..2}
00:08:03.692 13:16:43 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2'
00:08:03.692 spdk_app_start Round 2
00:08:03.692 13:16:43 event.app_repeat -- event/event.sh@25 -- # waitforlisten 833660 /var/tmp/spdk-nbd.sock
00:08:03.692 13:16:43 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 833660 ']'
00:08:03.692 13:16:43 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:08:03.692 13:16:43 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100
00:08:03.692 13:16:43 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:08:03.692 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:08:03.692 13:16:43 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable
00:08:03.692 13:16:43 event.app_repeat -- common/autotest_common.sh@10 -- # set +x
00:08:03.692 13:16:44 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:08:03.692 13:16:44 event.app_repeat -- common/autotest_common.sh@864 -- # return 0
00:08:03.692 13:16:44 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:08:03.692 Malloc0
00:08:03.692 13:16:44 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096
00:08:03.953 Malloc1
00:08:03.953 13:16:44 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1')
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1'
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1')
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0
00:08:03.953 /dev/nbd0
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:08:03.953 13:16:44 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:08:03.953 13:16:44 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0
00:08:03.953 13:16:44 event.app_repeat -- common/autotest_common.sh@869 -- # local i
00:08:03.953 13:16:44 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:08:03.953 13:16:44 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:08:03.953 13:16:44 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions
00:08:04.213 13:16:44 event.app_repeat -- common/autotest_common.sh@873 -- # break
00:08:04.213 13:16:44 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:08:04.213 13:16:44 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:08:04.213 13:16:44 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:08:04.213 1+0 records in
00:08:04.213 1+0 records out
00:08:04.213 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021358 s, 19.2 MB/s
00:08:04.213 13:16:44 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:08:04.213 13:16:44 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096
00:08:04.213 13:16:44 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:08:04.213 13:16:44 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:08:04.213 13:16:44 event.app_repeat -- common/autotest_common.sh@889 -- # return 0
00:08:04.213 13:16:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:04.213 13:16:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:08:04.213 13:16:44 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1
00:08:04.214 /dev/nbd1
00:08:04.214 13:16:44 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:08:04.214 13:16:44 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@868 -- # local nbd_name=nbd1
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@869 -- # local i
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@873 -- # break
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct
00:08:04.214 1+0 records in
00:08:04.214 1+0 records out
00:08:04.214 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000131611 s, 31.1 MB/s
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@886 -- # size=4096
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:08:04.214 13:16:44 event.app_repeat -- common/autotest_common.sh@889 -- # return 0
00:08:04.214 13:16:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:08:04.214 13:16:44 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:08:04.214 13:16:44 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:08:04.214 13:16:44 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:04.214 13:16:44 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:08:04.474 {
00:08:04.474 "nbd_device": "/dev/nbd0",
00:08:04.474 "bdev_name": "Malloc0"
00:08:04.474 },
00:08:04.474 {
00:08:04.474 "nbd_device": "/dev/nbd1",
00:08:04.474 "bdev_name": "Malloc1"
00:08:04.474 }
00:08:04.474 ]'
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[
00:08:04.474 {
00:08:04.474 "nbd_device": "/dev/nbd0",
00:08:04.474 "bdev_name": "Malloc0"
00:08:04.474 },
00:08:04.474 {
00:08:04.474 "nbd_device": "/dev/nbd1",
00:08:04.474 "bdev_name": "Malloc1"
00:08:04.474 }
00:08:04.474 ]'
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:08:04.474 /dev/nbd1'
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:08:04.474 /dev/nbd1'
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:08:04.474 13:16:45 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256
00:08:04.734 256+0 records in
00:08:04.734 256+0 records out
00:08:04.734 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125314 s, 83.7 MB/s
00:08:04.734 13:16:45 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:08:04.734 13:16:45 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:08:04.734 256+0 records in
00:08:04.734 256+0 records out
00:08:04.734 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0142899 s, 73.4 MB/s
00:08:04.734 13:16:45 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:08:04.734 13:16:45 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:08:04.734 256+0 records in
00:08:04.734 256+0 records out
00:08:04.734 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0156819 s, 66.9 MB/s
00:08:04.734 13:16:45 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:08:04.734 13:16:45 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:08:04.734 13:16:45 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:04.735 13:16:45 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@41 -- # break
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@41
-- # break 00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:08:04.995 13:16:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:08:05.255 13:16:45 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:08:05.255 13:16:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:08:05.255 13:16:45 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:08:05.255 13:16:46 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:08:05.255 13:16:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:08:05.255 13:16:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:08:05.255 13:16:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:08:05.255 13:16:46 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:08:05.255 13:16:46 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:08:05.255 13:16:46 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:08:05.255 13:16:46 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:08:05.255 13:16:46 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:08:05.255 13:16:46 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:08:05.515 13:16:46 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:08:05.775 [2024-07-25 13:16:46.377713] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:05.775 [2024-07-25 13:16:46.440177] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:05.775 [2024-07-25 13:16:46.440181] reactor.c: 941:reactor_run: 
*NOTICE*: Reactor started on core 0 00:08:05.775 [2024-07-25 13:16:46.470807] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:08:05.775 [2024-07-25 13:16:46.470843] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:08:09.071 13:16:49 event.app_repeat -- event/event.sh@38 -- # waitforlisten 833660 /var/tmp/spdk-nbd.sock 00:08:09.071 13:16:49 event.app_repeat -- common/autotest_common.sh@831 -- # '[' -z 833660 ']' 00:08:09.071 13:16:49 event.app_repeat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:08:09.071 13:16:49 event.app_repeat -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:09.071 13:16:49 event.app_repeat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:08:09.071 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:08:09.071 13:16:49 event.app_repeat -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:09.071 13:16:49 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:09.071 13:16:49 event.app_repeat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@864 -- # return 0 00:08:09.072 13:16:49 event.app_repeat -- event/event.sh@39 -- # killprocess 833660 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@950 -- # '[' -z 833660 ']' 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@954 -- # kill -0 833660 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@955 -- # uname 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 833660 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 833660' 00:08:09.072 killing process with pid 833660 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@969 -- # kill 833660 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@974 -- # wait 833660 00:08:09.072 spdk_app_start is called in Round 0. 00:08:09.072 Shutdown signal received, stop current app iteration 00:08:09.072 Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 reinitialization... 00:08:09.072 spdk_app_start is called in Round 1. 00:08:09.072 Shutdown signal received, stop current app iteration 00:08:09.072 Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 reinitialization... 00:08:09.072 spdk_app_start is called in Round 2. 
00:08:09.072 Shutdown signal received, stop current app iteration 00:08:09.072 Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 reinitialization... 00:08:09.072 spdk_app_start is called in Round 3. 00:08:09.072 Shutdown signal received, stop current app iteration 00:08:09.072 13:16:49 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:08:09.072 13:16:49 event.app_repeat -- event/event.sh@42 -- # return 0 00:08:09.072 00:08:09.072 real 0m17.080s 00:08:09.072 user 0m37.881s 00:08:09.072 sys 0m2.435s 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:09.072 13:16:49 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:08:09.072 ************************************ 00:08:09.072 END TEST app_repeat 00:08:09.072 ************************************ 00:08:09.072 13:16:49 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:08:09.072 00:08:09.072 real 0m27.111s 00:08:09.072 user 0m56.292s 00:08:09.072 sys 0m3.556s 00:08:09.072 13:16:49 event -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:09.072 13:16:49 event -- common/autotest_common.sh@10 -- # set +x 00:08:09.072 ************************************ 00:08:09.072 END TEST event 00:08:09.072 ************************************ 00:08:09.072 13:16:49 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:08:09.072 13:16:49 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:09.072 13:16:49 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:09.072 13:16:49 -- common/autotest_common.sh@10 -- # set +x 00:08:09.072 ************************************ 00:08:09.072 START TEST thread 00:08:09.072 ************************************ 00:08:09.072 13:16:49 thread -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:08:09.072 * Looking for test storage... 
00:08:09.072 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:08:09.072 13:16:49 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:09.072 13:16:49 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:08:09.072 13:16:49 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:09.072 13:16:49 thread -- common/autotest_common.sh@10 -- # set +x 00:08:09.072 ************************************ 00:08:09.072 START TEST thread_poller_perf 00:08:09.072 ************************************ 00:08:09.072 13:16:49 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:08:09.333 [2024-07-25 13:16:49.877332] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:09.333 [2024-07-25 13:16:49.877423] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid836737 ] 00:08:09.333 [2024-07-25 13:16:49.969425] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:09.333 [2024-07-25 13:16:50.040913] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:09.333 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:08:10.714 ====================================== 00:08:10.714 busy:2614173518 (cyc) 00:08:10.714 total_run_count: 312000 00:08:10.714 tsc_hz: 2600000000 (cyc) 00:08:10.714 ====================================== 00:08:10.714 poller_cost: 8378 (cyc), 3222 (nsec) 00:08:10.714 00:08:10.714 real 0m1.253s 00:08:10.714 user 0m1.156s 00:08:10.714 sys 0m0.093s 00:08:10.714 13:16:51 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:10.714 13:16:51 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:08:10.714 ************************************ 00:08:10.714 END TEST thread_poller_perf 00:08:10.714 ************************************ 00:08:10.714 13:16:51 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:10.714 13:16:51 thread -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:08:10.714 13:16:51 thread -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:10.714 13:16:51 thread -- common/autotest_common.sh@10 -- # set +x 00:08:10.714 ************************************ 00:08:10.714 START TEST thread_poller_perf 00:08:10.714 ************************************ 00:08:10.714 13:16:51 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:08:10.714 [2024-07-25 13:16:51.204783] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:10.714 [2024-07-25 13:16:51.204877] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid836971 ] 00:08:10.714 [2024-07-25 13:16:51.293795] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:10.714 [2024-07-25 13:16:51.357569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:10.714 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:08:11.655 ====================================== 00:08:11.655 busy:2602104520 (cyc) 00:08:11.655 total_run_count: 4117000 00:08:11.655 tsc_hz: 2600000000 (cyc) 00:08:11.655 ====================================== 00:08:11.655 poller_cost: 632 (cyc), 243 (nsec) 00:08:11.655 00:08:11.655 real 0m1.234s 00:08:11.655 user 0m1.139s 00:08:11.655 sys 0m0.091s 00:08:11.655 13:16:52 thread.thread_poller_perf -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:11.655 13:16:52 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:08:11.655 ************************************ 00:08:11.655 END TEST thread_poller_perf 00:08:11.655 ************************************ 00:08:11.915 13:16:52 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:08:11.915 00:08:11.915 real 0m2.742s 00:08:11.915 user 0m2.388s 00:08:11.915 sys 0m0.359s 00:08:11.915 13:16:52 thread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:11.915 13:16:52 thread -- common/autotest_common.sh@10 -- # set +x 00:08:11.915 ************************************ 00:08:11.915 END TEST thread 00:08:11.915 ************************************ 00:08:11.915 13:16:52 -- spdk/autotest.sh@184 -- # [[ 1 -eq 1 ]] 00:08:11.915 13:16:52 -- spdk/autotest.sh@185 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:08:11.915 13:16:52 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 
']' 00:08:11.915 13:16:52 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:11.915 13:16:52 -- common/autotest_common.sh@10 -- # set +x 00:08:11.915 ************************************ 00:08:11.915 START TEST accel 00:08:11.915 ************************************ 00:08:11.915 13:16:52 accel -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:08:11.915 * Looking for test storage... 00:08:11.915 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:11.915 13:16:52 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:08:11.915 13:16:52 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:08:11.915 13:16:52 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:11.915 13:16:52 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=837327 00:08:11.915 13:16:52 accel -- accel/accel.sh@63 -- # waitforlisten 837327 00:08:11.915 13:16:52 accel -- common/autotest_common.sh@831 -- # '[' -z 837327 ']' 00:08:11.915 13:16:52 accel -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:11.915 13:16:52 accel -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:11.916 13:16:52 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:11.916 13:16:52 accel -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:11.916 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:08:11.916 13:16:52 accel -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:11.916 13:16:52 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:11.916 13:16:52 accel -- common/autotest_common.sh@10 -- # set +x 00:08:11.916 13:16:52 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:11.916 13:16:52 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:11.916 13:16:52 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:11.916 13:16:52 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:11.916 13:16:52 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:11.916 13:16:52 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:11.916 13:16:52 accel -- accel/accel.sh@41 -- # jq -r . 00:08:11.916 [2024-07-25 13:16:52.692043] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:11.916 [2024-07-25 13:16:52.692103] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid837327 ] 00:08:12.176 [2024-07-25 13:16:52.767109] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:12.176 [2024-07-25 13:16:52.829445] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:12.747 13:16:53 accel -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:12.747 13:16:53 accel -- common/autotest_common.sh@864 -- # return 0 00:08:12.747 13:16:53 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:12.747 13:16:53 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:12.747 13:16:53 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:12.747 13:16:53 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:08:12.747 13:16:53 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:12.747 13:16:53 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:12.747 13:16:53 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:12.747 13:16:53 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:08:12.747 13:16:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.006 13:16:53 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:13.006 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.006 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.006 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.006 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.006 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.006 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # 
IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # IFS== 00:08:13.007 13:16:53 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:13.007 13:16:53 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:13.007 13:16:53 accel -- accel/accel.sh@75 -- # killprocess 837327 00:08:13.007 13:16:53 accel -- common/autotest_common.sh@950 -- # '[' -z 837327 ']' 00:08:13.007 13:16:53 accel -- common/autotest_common.sh@954 -- # kill -0 837327 00:08:13.007 13:16:53 accel -- common/autotest_common.sh@955 -- # uname 00:08:13.007 13:16:53 accel -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:13.007 13:16:53 accel -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 837327 00:08:13.007 13:16:53 accel -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:13.007 13:16:53 accel -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:13.007 13:16:53 accel -- common/autotest_common.sh@968 -- # echo 'killing process with pid 837327' 00:08:13.007 killing process with pid 837327 00:08:13.007 13:16:53 accel -- common/autotest_common.sh@969 -- # kill 837327 00:08:13.007 
13:16:53 accel -- common/autotest_common.sh@974 -- # wait 837327 00:08:13.267 13:16:53 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:13.267 13:16:53 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:08:13.267 13:16:53 accel -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:08:13.267 13:16:53 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:13.267 13:16:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.267 13:16:53 accel.accel_help -- common/autotest_common.sh@1125 -- # accel_perf -h 00:08:13.267 13:16:53 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:08:13.267 13:16:53 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:08:13.267 13:16:53 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.267 13:16:53 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.267 13:16:53 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.267 13:16:53 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.267 13:16:53 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:13.267 13:16:53 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:08:13.267 13:16:53 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:08:13.267 13:16:53 accel.accel_help -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:13.267 13:16:53 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:08:13.267 13:16:53 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:08:13.267 13:16:53 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:13.267 13:16:53 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:13.267 13:16:53 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.267 ************************************ 00:08:13.267 START TEST accel_missing_filename 00:08:13.267 ************************************ 00:08:13.267 13:16:53 accel.accel_missing_filename -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w compress 00:08:13.267 13:16:53 accel.accel_missing_filename -- common/autotest_common.sh@650 -- # local es=0 00:08:13.267 13:16:53 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w compress 00:08:13.267 13:16:53 accel.accel_missing_filename -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:08:13.267 13:16:53 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:13.267 13:16:53 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # type -t accel_perf 00:08:13.268 13:16:53 accel.accel_missing_filename -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:13.268 13:16:53 accel.accel_missing_filename -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress 00:08:13.268 13:16:53 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:08:13.268 13:16:53 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:08:13.268 13:16:53 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.268 13:16:53 
accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.268 13:16:53 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.268 13:16:53 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.268 13:16:53 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:13.268 13:16:53 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:08:13.268 13:16:53 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:08:13.268 [2024-07-25 13:16:54.022000] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:13.268 [2024-07-25 13:16:54.022084] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid837667 ] 00:08:13.528 [2024-07-25 13:16:54.116357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:13.528 [2024-07-25 13:16:54.190408] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:13.528 [2024-07-25 13:16:54.233834] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:13.528 [2024-07-25 13:16:54.270956] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:08:13.528 A filename is required. 
00:08:13.788 13:16:54 accel.accel_missing_filename -- common/autotest_common.sh@653 -- # es=234 00:08:13.788 13:16:54 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:08:13.788 13:16:54 accel.accel_missing_filename -- common/autotest_common.sh@662 -- # es=106 00:08:13.788 13:16:54 accel.accel_missing_filename -- common/autotest_common.sh@663 -- # case "$es" in 00:08:13.788 13:16:54 accel.accel_missing_filename -- common/autotest_common.sh@670 -- # es=1 00:08:13.788 13:16:54 accel.accel_missing_filename -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:08:13.788 00:08:13.788 real 0m0.335s 00:08:13.788 user 0m0.235s 00:08:13.788 sys 0m0.132s 00:08:13.788 13:16:54 accel.accel_missing_filename -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:13.788 13:16:54 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:08:13.788 ************************************ 00:08:13.788 END TEST accel_missing_filename 00:08:13.788 ************************************ 00:08:13.788 13:16:54 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:13.788 13:16:54 accel -- common/autotest_common.sh@1101 -- # '[' 10 -le 1 ']' 00:08:13.788 13:16:54 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:13.788 13:16:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:13.788 ************************************ 00:08:13.788 START TEST accel_compress_verify 00:08:13.788 ************************************ 00:08:13.788 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:13.788 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@650 -- # local es=0 00:08:13.788 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # valid_exec_arg 
accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:13.788 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:08:13.788 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:13.788 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # type -t accel_perf 00:08:13.788 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:13.788 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:13.788 13:16:54 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:13.788 13:16:54 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:08:13.788 13:16:54 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:13.788 13:16:54 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:13.788 13:16:54 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:13.788 13:16:54 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:13.788 13:16:54 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:13.788 13:16:54 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:08:13.788 13:16:54 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:08:13.788 [2024-07-25 13:16:54.431491] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:13.788 [2024-07-25 13:16:54.431561] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid837690 ] 00:08:13.788 [2024-07-25 13:16:54.522739] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.049 [2024-07-25 13:16:54.598402] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.049 [2024-07-25 13:16:54.641823] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:14.049 [2024-07-25 13:16:54.680023] accel_perf.c:1463:main: *ERROR*: ERROR starting application 00:08:14.049 00:08:14.049 Compression does not support the verify option, aborting. 00:08:14.049 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@653 -- # es=161 00:08:14.049 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:08:14.049 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@662 -- # es=33 00:08:14.049 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@663 -- # case "$es" in 00:08:14.049 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@670 -- # es=1 00:08:14.049 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:08:14.049 00:08:14.049 real 0m0.337s 00:08:14.049 user 0m0.239s 00:08:14.049 sys 0m0.129s 00:08:14.049 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:14.049 13:16:54 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:08:14.049 ************************************ 00:08:14.049 END TEST accel_compress_verify 00:08:14.049 ************************************ 00:08:14.049 13:16:54 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:08:14.049 13:16:54 accel -- 
common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:14.049 13:16:54 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:14.049 13:16:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.049 ************************************ 00:08:14.049 START TEST accel_wrong_workload 00:08:14.049 ************************************ 00:08:14.049 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w foobar 00:08:14.049 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@650 -- # local es=0 00:08:14.049 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:08:14.049 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:08:14.049 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:14.049 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # type -t accel_perf 00:08:14.049 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:14.049 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w foobar 00:08:14.049 13:16:54 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:08:14.049 13:16:54 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:08:14.049 13:16:54 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:14.049 13:16:54 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:14.049 13:16:54 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:14.049 13:16:54 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:14.049 13:16:54 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:14.049 13:16:54 
accel.accel_wrong_workload -- accel/accel.sh@40 -- # local IFS=, 00:08:14.049 13:16:54 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:08:14.049 Unsupported workload type: foobar 00:08:14.049 [2024-07-25 13:16:54.840349] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:08:14.311 accel_perf options: 00:08:14.311 [-h help message] 00:08:14.311 [-q queue depth per core] 00:08:14.311 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:14.311 [-T number of threads per core 00:08:14.311 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:08:14.311 [-t time in seconds] 00:08:14.311 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:14.311 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:14.311 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:14.311 [-l for compress/decompress workloads, name of uncompressed input file 00:08:14.311 [-S for crc32c workload, use this seed value (default 0) 00:08:14.311 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:14.311 [-f for fill workload, use this BYTE value (default 255) 00:08:14.311 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:14.311 [-y verify result if this switch is on] 00:08:14.311 [-a tasks to allocate per core (default: same value as -q)] 00:08:14.311 Can be used to spread operations across a wider range of memory. 
00:08:14.311 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@653 -- # es=1 00:08:14.311 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:08:14.311 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:08:14.311 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:08:14.311 00:08:14.311 real 0m0.041s 00:08:14.311 user 0m0.025s 00:08:14.311 sys 0m0.016s 00:08:14.311 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:14.311 13:16:54 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:08:14.311 ************************************ 00:08:14.311 END TEST accel_wrong_workload 00:08:14.311 ************************************ 00:08:14.311 Error: writing output failed: Broken pipe 00:08:14.311 13:16:54 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:08:14.311 13:16:54 accel -- common/autotest_common.sh@1101 -- # '[' 10 -le 1 ']' 00:08:14.311 13:16:54 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:14.311 13:16:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.311 ************************************ 00:08:14.311 START TEST accel_negative_buffers 00:08:14.311 ************************************ 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@1125 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@650 -- # local es=0 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@638 -- # local arg=accel_perf 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:14.311 13:16:54 
accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # type -t accel_perf 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@653 -- # accel_perf -t 1 -w xor -y -x -1 00:08:14.311 13:16:54 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:08:14.311 13:16:54 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:08:14.311 13:16:54 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:14.311 13:16:54 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:14.311 13:16:54 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:14.311 13:16:54 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:14.311 13:16:54 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:14.311 13:16:54 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:08:14.311 13:16:54 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:08:14.311 -x option must be non-negative. 00:08:14.311 [2024-07-25 13:16:54.956480] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:08:14.311 accel_perf options: 00:08:14.311 [-h help message] 00:08:14.311 [-q queue depth per core] 00:08:14.311 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:08:14.311 [-T number of threads per core 00:08:14.311 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:08:14.311 [-t time in seconds] 00:08:14.311 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:08:14.311 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:08:14.311 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:08:14.311 [-l for compress/decompress workloads, name of uncompressed input file 00:08:14.311 [-S for crc32c workload, use this seed value (default 0) 00:08:14.311 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:08:14.311 [-f for fill workload, use this BYTE value (default 255) 00:08:14.311 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:08:14.311 [-y verify result if this switch is on] 00:08:14.311 [-a tasks to allocate per core (default: same value as -q)] 00:08:14.311 Can be used to spread operations across a wider range of memory. 
00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@653 -- # es=1 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:08:14.311 00:08:14.311 real 0m0.042s 00:08:14.311 user 0m0.029s 00:08:14.311 sys 0m0.012s 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:14.311 13:16:54 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:08:14.311 ************************************ 00:08:14.311 END TEST accel_negative_buffers 00:08:14.311 ************************************ 00:08:14.311 Error: writing output failed: Broken pipe 00:08:14.311 13:16:54 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:08:14.311 13:16:54 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:08:14.311 13:16:54 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:14.311 13:16:54 accel -- common/autotest_common.sh@10 -- # set +x 00:08:14.311 ************************************ 00:08:14.311 START TEST accel_crc32c 00:08:14.311 ************************************ 00:08:14.311 13:16:55 accel.accel_crc32c -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w crc32c -S 32 -y 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:14.311 13:16:55 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:14.311 [2024-07-25 13:16:55.065776] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:14.311 [2024-07-25 13:16:55.065840] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid837774 ] 00:08:14.573 [2024-07-25 13:16:55.154923] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:14.573 [2024-07-25 13:16:55.229810] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r 
var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 
accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- 
accel/accel.sh@20 -- # val=Yes 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:14.573 13:16:55 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 
accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:08:15.957 13:16:56 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:15.957 00:08:15.957 real 0m1.334s 00:08:15.957 user 0m1.206s 00:08:15.957 sys 0m0.125s 00:08:15.957 13:16:56 accel.accel_crc32c -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:15.957 13:16:56 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:15.957 ************************************ 00:08:15.957 END TEST accel_crc32c 00:08:15.957 ************************************ 00:08:15.957 13:16:56 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:08:15.957 13:16:56 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:08:15.957 13:16:56 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:15.957 13:16:56 accel -- common/autotest_common.sh@10 -- # set +x 00:08:15.957 ************************************ 00:08:15.957 START TEST accel_crc32c_C2 00:08:15.957 ************************************ 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- 
common/autotest_common.sh@1125 -- # accel_test -t 1 -w crc32c -y -C 2 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:15.957 [2024-07-25 13:16:56.482783] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:15.957 [2024-07-25 13:16:56.482894] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid838074 ] 00:08:15.957 [2024-07-25 13:16:56.579886] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:15.957 [2024-07-25 13:16:56.649337] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 
accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.957 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.958 13:16:56 accel.accel_crc32c_C2 
-- accel/accel.sh@20 -- # val=32 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:15.958 
13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:15.958 13:16:56 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case 
"$var" in 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:17.341 00:08:17.341 real 0m1.344s 00:08:17.341 user 0m1.203s 00:08:17.341 sys 0m0.136s 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:17.341 13:16:57 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:08:17.341 ************************************ 00:08:17.341 END TEST accel_crc32c_C2 00:08:17.341 ************************************ 00:08:17.341 13:16:57 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:08:17.341 13:16:57 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:17.341 13:16:57 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:17.341 13:16:57 accel -- common/autotest_common.sh@10 -- # set +x 00:08:17.341 ************************************ 00:08:17.341 START TEST accel_copy 00:08:17.341 ************************************ 00:08:17.341 13:16:57 accel.accel_copy -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy -y 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c 
/dev/fd/62 -t 1 -w copy -y 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:17.341 13:16:57 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:08:17.341 [2024-07-25 13:16:57.902303] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:17.341 [2024-07-25 13:16:57.902358] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid838400 ] 00:08:17.341 [2024-07-25 13:16:57.992339] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:17.341 [2024-07-25 13:16:58.066296] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:08:17.341 13:16:58 
accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.341 13:16:58 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.342 13:16:58 
accel.accel_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 
accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:17.342 13:16:58 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:18.724 13:16:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:18.724 13:16:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:18.724 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:08:18.725 13:16:59 accel.accel_copy -- 
accel/accel.sh@21 -- # case "$var" in 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:08:18.725 13:16:59 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:18.725 00:08:18.725 real 0m1.335s 00:08:18.725 user 0m1.196s 00:08:18.725 sys 0m0.138s 00:08:18.725 13:16:59 accel.accel_copy -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:18.725 13:16:59 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:08:18.725 ************************************ 00:08:18.725 END TEST accel_copy 00:08:18.725 ************************************ 00:08:18.725 13:16:59 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:18.725 13:16:59 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:18.725 13:16:59 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:18.725 13:16:59 accel -- common/autotest_common.sh@10 -- # set +x 00:08:18.725 ************************************ 00:08:18.725 START TEST accel_fill 00:08:18.725 ************************************ 00:08:18.725 13:16:59 accel.accel_fill -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w fill -f 128 -q 64 -a 64 -y 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:08:18.725 13:16:59 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:08:18.725 [2024-07-25 13:16:59.315080] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:18.725 [2024-07-25 13:16:59.315146] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid838642 ] 00:08:18.725 [2024-07-25 13:16:59.408827] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:18.725 [2024-07-25 13:16:59.484622] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 
13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:18.985 13:16:59 
accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:08:18.985 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.986 
13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:18.986 13:16:59 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:19.925 13:17:00 accel.accel_fill -- 
accel/accel.sh@20 -- # val= 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:08:19.925 13:17:00 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:19.925 00:08:19.925 real 0m1.342s 00:08:19.925 user 0m1.210s 00:08:19.925 sys 0m0.133s 00:08:19.925 13:17:00 accel.accel_fill -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:19.925 13:17:00 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:08:19.925 ************************************ 00:08:19.925 END TEST accel_fill 00:08:19.925 ************************************ 00:08:19.925 13:17:00 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:08:19.925 13:17:00 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:19.925 13:17:00 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:19.925 13:17:00 accel -- common/autotest_common.sh@10 -- # set +x 00:08:19.925 ************************************ 00:08:19.925 START TEST accel_copy_crc32c 00:08:19.925 ************************************ 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy_crc32c -y 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # 
local accel_module 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:08:19.925 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:08:20.186 [2024-07-25 13:17:00.731878] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:20.186 [2024-07-25 13:17:00.731935] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid838768 ] 00:08:20.186 [2024-07-25 13:17:00.820376] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:20.186 [2024-07-25 13:17:00.890102] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 
00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- 
accel/accel.sh@20 -- # val=software 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:20.186 13:17:00 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:21.571 13:17:02 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:21.571 00:08:21.571 real 0m1.339s 00:08:21.571 user 0m1.199s 00:08:21.571 sys 0m0.124s 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:21.571 13:17:02 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:08:21.571 ************************************ 00:08:21.571 END TEST accel_copy_crc32c 00:08:21.571 ************************************ 00:08:21.571 13:17:02 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:08:21.571 13:17:02 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:08:21.571 13:17:02 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:21.571 13:17:02 accel -- common/autotest_common.sh@10 -- # set +x 00:08:21.571 ************************************ 00:08:21.571 START TEST accel_copy_crc32c_C2 00:08:21.571 ************************************ 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- 
common/autotest_common.sh@1125 -- # accel_test -t 1 -w copy_crc32c -y -C 2 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:08:21.571 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:08:21.571 [2024-07-25 13:17:02.151311] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:21.571 [2024-07-25 13:17:02.151428] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid839075 ] 00:08:21.571 [2024-07-25 13:17:02.291686] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:21.832 [2024-07-25 13:17:02.367562] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:21.832 13:17:02 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 
00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # 
val=Yes 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:21.832 13:17:02 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:22.779 
13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:22.779 00:08:22.779 real 0m1.392s 00:08:22.779 user 0m1.220s 00:08:22.779 sys 0m0.165s 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:22.779 13:17:03 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:08:22.779 ************************************ 00:08:22.779 END TEST accel_copy_crc32c_C2 00:08:22.779 ************************************ 00:08:22.779 13:17:03 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:08:22.779 13:17:03 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:22.779 13:17:03 accel -- common/autotest_common.sh@1107 
-- # xtrace_disable 00:08:22.779 13:17:03 accel -- common/autotest_common.sh@10 -- # set +x 00:08:23.071 ************************************ 00:08:23.071 START TEST accel_dualcast 00:08:23.071 ************************************ 00:08:23.071 13:17:03 accel.accel_dualcast -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dualcast -y 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:08:23.071 [2024-07-25 13:17:03.606691] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:23.071 [2024-07-25 13:17:03.606753] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid839390 ] 00:08:23.071 [2024-07-25 13:17:03.695526] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:23.071 [2024-07-25 13:17:03.768959] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- 
accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 
00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.071 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:23.072 13:17:03 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case 
"$var" in 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:08:24.457 13:17:04 accel.accel_dualcast -- accel/accel.sh@27 
-- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:24.457 00:08:24.457 real 0m1.325s 00:08:24.457 user 0m1.195s 00:08:24.457 sys 0m0.127s 00:08:24.457 13:17:04 accel.accel_dualcast -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:24.457 13:17:04 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:08:24.457 ************************************ 00:08:24.457 END TEST accel_dualcast 00:08:24.457 ************************************ 00:08:24.457 13:17:04 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:08:24.457 13:17:04 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:24.457 13:17:04 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:24.457 13:17:04 accel -- common/autotest_common.sh@10 -- # set +x 00:08:24.457 ************************************ 00:08:24.457 START TEST accel_compare 00:08:24.457 ************************************ 00:08:24.457 13:17:04 accel.accel_compare -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compare -y 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:24.457 
13:17:04 accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:08:24.457 13:17:04 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:08:24.457 [2024-07-25 13:17:05.007259] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:24.457 [2024-07-25 13:17:05.007347] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid839716 ] 00:08:24.457 [2024-07-25 13:17:05.101155] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:24.457 [2024-07-25 13:17:05.177417] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:24.457 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:24.457 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.457 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.457 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.457 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:24.457 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.457 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:24.458 13:17:05 accel.accel_compare -- 
accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 
accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:24.458 13:17:05 accel.accel_compare -- 
accel/accel.sh@21 -- # case "$var" in 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:24.458 13:17:05 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 
00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:08:25.840 13:17:06 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:25.840 00:08:25.840 real 0m1.350s 00:08:25.840 user 0m1.200s 00:08:25.840 sys 0m0.136s 00:08:25.840 13:17:06 accel.accel_compare -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:25.840 13:17:06 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:08:25.840 ************************************ 00:08:25.840 END TEST accel_compare 00:08:25.840 ************************************ 00:08:25.840 13:17:06 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:08:25.840 13:17:06 accel -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:08:25.840 13:17:06 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:25.840 13:17:06 accel -- common/autotest_common.sh@10 -- # set +x 00:08:25.840 ************************************ 00:08:25.840 START TEST accel_xor 00:08:25.840 ************************************ 00:08:25.840 13:17:06 accel.accel_xor -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w xor -y 00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=,
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@41 -- # jq -r .
[2024-07-25 13:17:06.426796] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
[2024-07-25 13:17:06.426877] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid839898 ]
[2024-07-25 13:17:06.513146] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-07-25 13:17:06.588136] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:25.840 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=xor
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=2
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=software
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=32
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=32
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=1
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds'
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:26.101 13:17:06 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.040 13:17:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]]
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:27.041
00:08:27.041 real 0m1.325s
00:08:27.041 user 0m1.191s
00:08:27.041 sys 0m0.130s
00:08:27.041 13:17:07 accel.accel_xor -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:27.041 13:17:07 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x
00:08:27.041 ************************************
00:08:27.041 END TEST accel_xor
00:08:27.041 ************************************
00:08:27.041 13:17:07 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3
00:08:27.041 13:17:07 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']'
00:08:27.041 13:17:07 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:27.041 13:17:07 accel -- common/autotest_common.sh@10 -- # set +x
00:08:27.041 ************************************
00:08:27.041 START TEST accel_xor
00:08:27.041 ************************************
00:08:27.041 13:17:07 accel.accel_xor -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w xor -y -x 3
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x 3
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=,
00:08:27.041 13:17:07 accel.accel_xor -- accel/accel.sh@41 -- # jq -r .
[2024-07-25 13:17:07.824901] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
[2024-07-25 13:17:07.824959] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid840085 ]
00:08:27.301 [2024-07-25 13:17:07.916098] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:27.301 [2024-07-25 13:17:07.989811] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.301 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=xor
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=3
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=software
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=32
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=1
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds'
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:27.302 13:17:08 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]]
00:08:28.683 13:17:09 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:28.683
00:08:28.683 real 0m1.341s
00:08:28.683 user 0m1.206s
00:08:28.683 sys 0m0.122s
00:08:28.683 13:17:09 accel.accel_xor -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:28.683 13:17:09 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x
00:08:28.683 ************************************
00:08:28.683 END TEST accel_xor
00:08:28.683 ************************************
00:08:28.683 13:17:09 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify
00:08:28.683 13:17:09 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']'
00:08:28.683 13:17:09 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:28.683 13:17:09 accel -- common/autotest_common.sh@10 -- # set +x
00:08:28.683 ************************************
00:08:28.683 START TEST accel_dif_verify
00:08:28.683 ************************************
00:08:28.683 13:17:09 accel.accel_dif_verify -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_verify
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=,
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r .
[2024-07-25 13:17:09.238356] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:08:28.683 [2024-07-25 13:17:09.238423] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid840390 ]
[2024-07-25 13:17:09.329134] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
[2024-07-25 13:17:09.404238] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes'
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes'
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds'
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.683 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:28.684 13:17:09 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=:
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]]
00:08:30.065 13:17:10 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:30.065
00:08:30.065 real 0m1.332s
00:08:30.065 user 0m1.194s
00:08:30.065 sys 0m0.133s
00:08:30.065 13:17:10 accel.accel_dif_verify -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:30.065 13:17:10 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x
00:08:30.065 ************************************
00:08:30.065 END TEST accel_dif_verify
00:08:30.065 ************************************
00:08:30.065 13:17:10 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate
00:08:30.066 13:17:10 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']'
00:08:30.066 13:17:10 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:30.066 13:17:10 accel -- common/autotest_common.sh@10 -- # set +x
00:08:30.066 ************************************
00:08:30.066 START TEST accel_dif_generate
00:08:30.066 ************************************
00:08:30.066 13:17:10 accel.accel_dif_generate -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=:
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=,
00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r .
[2024-07-25 13:17:10.644218] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:08:30.066 [2024-07-25 13:17:10.644282] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid840707 ] 00:08:30.066 [2024-07-25 13:17:10.736227] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:30.066 [2024-07-25 13:17:10.810085] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # 
case "$var" in 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:08:30.066 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.325 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r 
var val 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.326 13:17:10 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:30.326 13:17:10 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:31.265 
13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:08:31.265 13:17:11 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:31.265 00:08:31.265 real 0m1.335s 00:08:31.265 user 0m1.199s 00:08:31.265 sys 0m0.130s 00:08:31.265 13:17:11 accel.accel_dif_generate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:31.265 13:17:11 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:08:31.265 ************************************ 00:08:31.265 END TEST accel_dif_generate 00:08:31.265 ************************************ 00:08:31.265 13:17:11 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w 
dif_generate_copy 00:08:31.265 13:17:11 accel -- common/autotest_common.sh@1101 -- # '[' 6 -le 1 ']' 00:08:31.265 13:17:11 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:31.265 13:17:11 accel -- common/autotest_common.sh@10 -- # set +x 00:08:31.265 ************************************ 00:08:31.265 START TEST accel_dif_generate_copy 00:08:31.265 ************************************ 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w dif_generate_copy 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:08:31.265 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 
00:08:31.265 [2024-07-25 13:17:12.051438] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:31.265 [2024-07-25 13:17:12.051504] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid840976 ] 00:08:31.524 [2024-07-25 13:17:12.139797] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:31.524 [2024-07-25 13:17:12.215911] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 
accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 
00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:08:31.524 
13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:31.524 13:17:12 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:32.904 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:32.904 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:32.904 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:32.904 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:32.904 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:32.904 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:32.904 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:32.904 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:32.904 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- 
accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:32.905 00:08:32.905 real 0m1.328s 00:08:32.905 user 0m1.200s 00:08:32.905 sys 0m0.126s 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:32.905 13:17:13 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:08:32.905 ************************************ 00:08:32.905 END TEST accel_dif_generate_copy 00:08:32.905 ************************************ 00:08:32.905 13:17:13 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:08:32.905 13:17:13 accel -- accel/accel.sh@116 -- # run_test accel_comp 
accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:32.905 13:17:13 accel -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:08:32.905 13:17:13 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:32.905 13:17:13 accel -- common/autotest_common.sh@10 -- # set +x 00:08:32.905 ************************************ 00:08:32.905 START TEST accel_comp 00:08:32.905 ************************************ 00:08:32.905 13:17:13 accel.accel_comp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 
00:08:32.905 [2024-07-25 13:17:13.450043] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:32.905 [2024-07-25 13:17:13.450093] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid841085 ] 00:08:32.905 [2024-07-25 13:17:13.540231] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:32.905 [2024-07-25 13:17:13.615775] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 
00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 
00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.905 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:08:32.906 
00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@20 -- # val=
00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in
00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # IFS=:
00:08:32.906 13:17:13 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val
00:08:34.286 13:17:14 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:34.286 13:17:14 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]]
00:08:34.286 13:17:14 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:34.286
00:08:34.286 real 0m1.333s
00:08:34.286 user 0m1.199s
00:08:34.286 sys 0m0.129s
00:08:34.286 13:17:14 accel.accel_comp -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:34.286 13:17:14 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x
00:08:34.286 ************************************
00:08:34.286 END TEST accel_comp
00:08:34.286 ************************************
00:08:34.286 13:17:14 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:08:34.286 13:17:14 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']'
00:08:34.286 13:17:14 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:34.286 13:17:14 accel -- common/autotest_common.sh@10 -- # set +x
00:08:34.286 ************************************
00:08:34.286 START TEST accel_decomp
00:08:34.286 ************************************
00:08:34.286 13:17:14 accel.accel_decomp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]]
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]]
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]]
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=,
00:08:34.286 13:17:14 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r .
00:08:34.286 [2024-07-25 13:17:14.859242] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:08:34.286 [2024-07-25 13:17:14.859309] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid841379 ]
00:08:34.286 [2024-07-25 13:17:14.951809] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:34.286 [2024-07-25 13:17:15.026449] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:34.286 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val=
00:08:34.286 13:17:15 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in
00:08:34.286 13:17:15 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=:
00:08:34.286 13:17:15 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val
00:08:34.286 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1
00:08:34.286 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress
00:08:34.286 13:17:15 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress
00:08:34.286 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:34.286 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val=software
00:08:34.286 13:17:15 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software
00:08:34.547 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:34.547 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val=32
00:08:34.547 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val=32
00:08:34.547 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val=1
00:08:34.547 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds'
00:08:34.547 13:17:15 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes
00:08:35.487 13:17:16 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:35.487 13:17:16 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:35.487 13:17:16 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:35.487
00:08:35.487 real 0m1.337s
00:08:35.487 user 0m1.200s
00:08:35.487 sys 0m0.132s
00:08:35.487 13:17:16 accel.accel_decomp -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:35.487 13:17:16 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x
00:08:35.487 ************************************
00:08:35.487 END TEST accel_decomp
00:08:35.487 ************************************
00:08:35.487 13:17:16 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:08:35.487 13:17:16 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']'
00:08:35.487 13:17:16 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:35.487 13:17:16 accel -- common/autotest_common.sh@10 -- # set +x
00:08:35.487 ************************************
00:08:35.487 START TEST accel_decomp_full
00:08:35.487 ************************************
00:08:35.487 13:17:16 accel.accel_decomp_full -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:08:35.487 13:17:16 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc
00:08:35.487 13:17:16 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module
00:08:35.487 13:17:16 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:08:35.487 13:17:16 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0
00:08:35.487 13:17:16 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config
00:08:35.487 13:17:16 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:35.487 13:17:16 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:35.487 13:17:16 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=,
00:08:35.487 13:17:16 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r .
00:08:35.488 [2024-07-25 13:17:16.271758] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:08:35.488 [2024-07-25 13:17:16.271835] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid841701 ]
00:08:35.747 [2024-07-25 13:17:16.361557] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:08:35.747 [2024-07-25 13:17:16.431758] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:35.747 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=
00:08:35.747 13:17:16 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in
00:08:35.747 13:17:16 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=:
00:08:35.747 13:17:16 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val
00:08:35.747 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1
00:08:35.747 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress
00:08:35.747 13:17:16 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress
00:08:35.748 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes'
00:08:35.748 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software
00:08:35.748 13:17:16 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software
00:08:35.748 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:35.748 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32
00:08:35.748 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32
00:08:35.748 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1
00:08:35.748 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds'
00:08:35.748 13:17:16 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes
00:08:37.130 13:17:17 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]]
00:08:37.130 13:17:17 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]]
00:08:37.130 13:17:17 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]]
00:08:37.130
00:08:37.130 real 0m1.338s
00:08:37.130 user 0m1.210s
00:08:37.130 sys 0m0.125s
00:08:37.130 13:17:17 accel.accel_decomp_full -- common/autotest_common.sh@1126 -- # xtrace_disable
00:08:37.130 13:17:17 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x
00:08:37.130 ************************************
00:08:37.130 END TEST accel_decomp_full
00:08:37.130 ************************************
00:08:37.130 13:17:17 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:08:37.130 13:17:17 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']'
00:08:37.130 13:17:17 accel -- common/autotest_common.sh@1107 -- # xtrace_disable
00:08:37.130 13:17:17 accel -- common/autotest_common.sh@10 -- # set +x
00:08:37.130 ************************************
00:08:37.130 START TEST accel_decomp_mcore
00:08:37.130 ************************************
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=()
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]]
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=,
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r .
00:08:37.130 [2024-07-25 13:17:17.682553] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:08:37.130 [2024-07-25 13:17:17.682630] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid842015 ]
00:08:37.130 [2024-07-25 13:17:17.773149] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:08:37.130 [2024-07-25 13:17:17.849616] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:08:37.130 [2024-07-25 13:17:17.849762] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2
00:08:37.130 [2024-07-25 13:17:17.849907] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:08:37.130 [2024-07-25 13:17:17.849907] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes'
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=software
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds'
00:08:37.130 13:17:17 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes
00:08:38.512 13:17:18 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:08:38.512 13:17:18 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in
00:08:38.512 13:17:18 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=:
00:08:38.512 13:17:18 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val
00:08:38.512 13:17:19 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=
00:08:38.512 13:17:19 accel.accel_decomp_mcore -- accel/accel.sh@21 -- #
case "$var" in 00:08:38.512 13:17:19 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.512 13:17:19 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.512 13:17:19 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:38.512 13:17:19 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:38.512 13:17:19 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:38.512 00:08:38.512 real 0m1.357s 00:08:38.512 user 0m4.497s 00:08:38.512 sys 0m0.149s 00:08:38.512 13:17:19 accel.accel_decomp_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:38.513 13:17:19 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:38.513 ************************************ 00:08:38.513 END TEST accel_decomp_mcore 00:08:38.513 ************************************ 00:08:38.513 13:17:19 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:38.513 13:17:19 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:38.513 13:17:19 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:38.513 13:17:19 accel -- common/autotest_common.sh@10 -- # set +x 00:08:38.513 ************************************ 00:08:38.513 START TEST accel_decomp_full_mcore 00:08:38.513 ************************************ 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.513 13:17:19 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:38.513 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:38.513 [2024-07-25 13:17:19.119196] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:38.513 [2024-07-25 13:17:19.119261] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid842189 ] 00:08:38.513 [2024-07-25 13:17:19.212688] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:38.513 [2024-07-25 13:17:19.290002] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:38.513 [2024-07-25 13:17:19.290149] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:38.513 [2024-07-25 13:17:19.290294] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.513 [2024-07-25 13:17:19.290294] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:38.772 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:38.772 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.772 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.772 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.772 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:38.772 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.772 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.772 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:38.773 13:17:19 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:38.773 13:17:19 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:39.713 13:17:20 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:39.713 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:39.714 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:39.714 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:39.714 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:39.714 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:39.714 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:39.714 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:39.714 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:39.714 13:17:20 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:39.714 00:08:39.714 real 0m1.417s 00:08:39.714 user 0m4.713s 00:08:39.714 sys 0m0.149s 00:08:39.714 13:17:20 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:39.714 13:17:20 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:39.714 ************************************ 00:08:39.714 END TEST accel_decomp_full_mcore 00:08:39.714 ************************************ 00:08:39.974 13:17:20 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:39.974 13:17:20 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:08:39.974 13:17:20 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:39.974 13:17:20 accel -- common/autotest_common.sh@10 -- # set +x 00:08:39.974 ************************************ 00:08:39.974 START TEST accel_decomp_mthread 
00:08:39.974 ************************************ 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:39.974 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:39.974 [2024-07-25 13:17:20.613511] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:39.974 [2024-07-25 13:17:20.613575] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid842397 ] 00:08:39.974 [2024-07-25 13:17:20.703540] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:40.234 [2024-07-25 13:17:20.773939] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:40.234 13:17:20 
accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.234 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # 
accel_module=software 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 
13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:40.235 13:17:20 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.176 13:17:21 accel.accel_decomp_mthread 
-- accel/accel.sh@19 -- # read -r var val 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:41.176 00:08:41.176 real 0m1.343s 00:08:41.176 user 0m1.207s 00:08:41.176 sys 0m0.136s 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:41.176 13:17:21 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 
00:08:41.176 ************************************ 00:08:41.176 END TEST accel_decomp_mthread 00:08:41.176 ************************************ 00:08:41.176 13:17:21 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:41.176 13:17:21 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:41.176 13:17:21 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:41.176 13:17:21 accel -- common/autotest_common.sh@10 -- # set +x 00:08:41.436 ************************************ 00:08:41.436 START TEST accel_decomp_full_mthread 00:08:41.436 ************************************ 00:08:41.436 13:17:21 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:41.436 13:17:21 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:41.436 13:17:21 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:41.436 13:17:21 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.436 13:17:21 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.436 13:17:21 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:41.436 13:17:21 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:41.436 13:17:21 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:41.436 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:41.436 13:17:22 
accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:41.436 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:41.436 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:41.436 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:41.436 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:41.436 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:41.436 [2024-07-25 13:17:22.031166] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:41.436 [2024-07-25 13:17:22.031225] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid842699 ] 00:08:41.436 [2024-07-25 13:17:22.118916] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:41.436 [2024-07-25 13:17:22.184556] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:41.436 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:41.697 13:17:22 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # 
case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" 
in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:41.697 13:17:22 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read 
-r var val 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" 
in 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:08:42.640 00:08:42.640 real 0m1.351s 00:08:42.640 user 0m1.229s 00:08:42.640 sys 0m0.117s 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:42.640 13:17:23 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:42.640 ************************************ 00:08:42.640 END TEST accel_decomp_full_mthread 00:08:42.640 ************************************ 00:08:42.640 13:17:23 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:08:42.640 13:17:23 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:08:42.640 13:17:23 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:08:42.640 13:17:23 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:42.640 13:17:23 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=843017 00:08:42.640 13:17:23 accel -- accel/accel.sh@63 -- # waitforlisten 843017 00:08:42.640 13:17:23 accel -- common/autotest_common.sh@831 -- # '[' -z 843017 ']' 00:08:42.640 13:17:23 accel -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:42.640 13:17:23 accel -- 
common/autotest_common.sh@836 -- # local max_retries=100 00:08:42.640 13:17:23 accel -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:42.640 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:42.640 13:17:23 accel -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:42.640 13:17:23 accel -- common/autotest_common.sh@10 -- # set +x 00:08:42.640 13:17:23 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:08:42.640 13:17:23 accel -- accel/accel.sh@61 -- # build_accel_config 00:08:42.640 13:17:23 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:42.640 13:17:23 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:42.640 13:17:23 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:42.640 13:17:23 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:42.640 13:17:23 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:42.640 13:17:23 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:42.640 13:17:23 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:42.640 13:17:23 accel -- accel/accel.sh@41 -- # jq -r . 00:08:42.899 [2024-07-25 13:17:23.456001] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:42.899 [2024-07-25 13:17:23.456058] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid843017 ] 00:08:42.899 [2024-07-25 13:17:23.546470] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:42.899 [2024-07-25 13:17:23.620780] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:43.469 [2024-07-25 13:17:24.015030] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:44.041 13:17:24 accel -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:44.041 13:17:24 accel -- common/autotest_common.sh@864 -- # return 0 00:08:44.041 13:17:24 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:08:44.041 13:17:24 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:08:44.041 13:17:24 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:08:44.041 13:17:24 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:08:44.041 13:17:24 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:08:44.041 13:17:24 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:08:44.041 13:17:24 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:08:44.041 13:17:24 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:08:44.041 13:17:24 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.041 13:17:24 accel -- common/autotest_common.sh@10 -- # set +x 00:08:44.041 13:17:24 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.041 "method": "compressdev_scan_accel_module", 00:08:44.041 13:17:24 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:08:44.041 13:17:24 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:08:44.041 13:17:24 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:08:44.041 13:17:24 accel -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:44.041 13:17:24 accel -- common/autotest_common.sh@10 -- # set +x 00:08:44.041 13:17:24 accel -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:44.041 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.041 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.041 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.041 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.041 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.041 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.041 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.041 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- 
accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # IFS== 00:08:44.302 13:17:24 accel -- accel/accel.sh@72 -- # read -r opc module 00:08:44.302 13:17:24 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:08:44.302 13:17:24 accel -- accel/accel.sh@75 -- # killprocess 843017 00:08:44.302 13:17:24 accel -- common/autotest_common.sh@950 -- # '[' -z 843017 ']' 00:08:44.302 13:17:24 accel -- common/autotest_common.sh@954 -- # kill -0 843017 00:08:44.302 13:17:24 accel -- common/autotest_common.sh@955 -- # uname 00:08:44.302 13:17:24 accel -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:44.302 13:17:24 accel -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 843017 00:08:44.302 13:17:24 accel -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:44.302 13:17:24 accel -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:44.302 13:17:24 accel -- common/autotest_common.sh@968 -- # echo 'killing process with pid 843017' 00:08:44.302 killing process with pid 843017 00:08:44.302 13:17:24 accel -- common/autotest_common.sh@969 -- # kill 
843017 00:08:44.302 13:17:24 accel -- common/autotest_common.sh@974 -- # wait 843017 00:08:44.562 13:17:25 accel -- accel/accel.sh@76 -- # trap - ERR 00:08:44.562 13:17:25 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:44.562 13:17:25 accel -- common/autotest_common.sh@1101 -- # '[' 8 -le 1 ']' 00:08:44.562 13:17:25 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:44.562 13:17:25 accel -- common/autotest_common.sh@10 -- # set +x 00:08:44.562 ************************************ 00:08:44.562 START TEST accel_cdev_comp 00:08:44.562 ************************************ 00:08:44.562 13:17:25 accel.accel_cdev_comp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:44.562 13:17:25 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:08:44.562 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:08:44.562 [2024-07-25 13:17:25.181511] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:44.562 [2024-07-25 13:17:25.181638] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid843338 ] 00:08:44.562 [2024-07-25 13:17:25.320357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:44.822 [2024-07-25 13:17:25.394985] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:45.083 [2024-07-25 13:17:25.792201] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:45.083 [2024-07-25 13:17:25.793965] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1f6f6a0 PMD being used: compress_qat 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var 
val 00:08:45.083 [2024-07-25 13:17:25.797010] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x21743f0 PMD being used: compress_qat 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:45.083 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:45.084 13:17:25 accel.accel_cdev_comp -- 
accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 
00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=1 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:45.084 13:17:25 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:46.466 13:17:26 accel.accel_cdev_comp -- 
accel/accel.sh@20 -- # val= 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:08:46.466 13:17:26 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:46.466 00:08:46.466 real 0m1.757s 00:08:46.466 user 0m1.412s 00:08:46.466 sys 0m0.338s 00:08:46.466 13:17:26 
accel.accel_cdev_comp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:46.466 13:17:26 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:08:46.466 ************************************ 00:08:46.466 END TEST accel_cdev_comp 00:08:46.466 ************************************ 00:08:46.466 13:17:26 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:46.466 13:17:26 accel -- common/autotest_common.sh@1101 -- # '[' 9 -le 1 ']' 00:08:46.466 13:17:26 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:46.466 13:17:26 accel -- common/autotest_common.sh@10 -- # set +x 00:08:46.466 ************************************ 00:08:46.466 START TEST accel_cdev_decomp 00:08:46.466 ************************************ 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 
00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:08:46.466 13:17:26 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:08:46.466 [2024-07-25 13:17:27.011058] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:46.466 [2024-07-25 13:17:27.011179] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid843661 ] 00:08:46.466 [2024-07-25 13:17:27.152754] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:46.466 [2024-07-25 13:17:27.229385] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:47.061 [2024-07-25 13:17:27.636403] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:47.061 [2024-07-25 13:17:27.638145] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xa296a0 PMD being used: compress_qat 00:08:47.061 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:47.061 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.061 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.061 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.061 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:47.061 13:17:27 
accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 [2024-07-25 13:17:27.641288] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xc2e3f0 PMD being used: compress_qat 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:47.062 13:17:27 
accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 
00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:47.062 13:17:27 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 
00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:08:48.036 
13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:48.036 00:08:48.036 real 0m1.771s 00:08:48.036 user 0m1.431s 00:08:48.036 sys 0m0.332s 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:48.036 13:17:28 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:08:48.036 ************************************ 00:08:48.036 END TEST accel_cdev_decomp 00:08:48.036 ************************************ 00:08:48.036 13:17:28 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:48.036 13:17:28 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:08:48.036 13:17:28 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:48.036 13:17:28 accel -- common/autotest_common.sh@10 -- # set +x 00:08:48.036 ************************************ 00:08:48.036 START TEST accel_cdev_decomp_full 00:08:48.036 ************************************ 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:08:48.036 13:17:28 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:08:48.297 [2024-07-25 13:17:28.844126] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:48.297 [2024-07-25 13:17:28.844190] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid843992 ] 00:08:48.297 [2024-07-25 13:17:28.934294] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:48.297 [2024-07-25 13:17:29.010775] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:48.866 [2024-07-25 13:17:29.408046] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:48.866 [2024-07-25 13:17:29.409844] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x128f6a0 PMD being used: compress_qat 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:48.866 [2024-07-25 13:17:29.412165] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1292970 PMD being used: compress_qat 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 
accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- 
accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.866 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.867 13:17:29 
accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:48.867 13:17:29 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 
00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:49.808 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:49.809 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:49.809 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:08:49.809 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:08:49.809 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:08:49.809 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:08:49.809 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:49.809 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:49.809 13:17:30 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == 
\d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:49.809 00:08:49.809 real 0m1.701s 00:08:49.809 user 0m1.386s 00:08:49.809 sys 0m0.311s 00:08:49.809 13:17:30 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:49.809 13:17:30 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:08:49.809 ************************************ 00:08:49.809 END TEST accel_cdev_decomp_full 00:08:49.809 ************************************ 00:08:49.809 13:17:30 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:49.809 13:17:30 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:08:49.809 13:17:30 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:49.809 13:17:30 accel -- common/autotest_common.sh@10 -- # set +x 00:08:49.809 ************************************ 00:08:49.809 START TEST accel_cdev_decomp_mcore 00:08:49.809 ************************************ 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:49.809 13:17:30 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:50.070 [2024-07-25 13:17:30.617080] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:50.070 [2024-07-25 13:17:30.617135] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid844317 ] 00:08:50.070 [2024-07-25 13:17:30.705659] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:50.070 [2024-07-25 13:17:30.773895] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:50.070 [2024-07-25 13:17:30.774040] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:50.070 [2024-07-25 13:17:30.774179] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:50.070 [2024-07-25 13:17:30.774179] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:50.640 [2024-07-25 13:17:31.170881] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:50.640 [2024-07-25 13:17:31.172626] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20e2cc0 PMD being used: compress_qat 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:50.640 [2024-07-25 13:17:31.177164] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 
0x7f450c19b8b0 PMD being used: compress_qat 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 [2024-07-25 13:17:31.178527] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x20e7f50 PMD being used: compress_qat 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 
13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:50.640 [2024-07-25 13:17:31.184233] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f450419b8b0 PMD being used: compress_qat 00:08:50.640 [2024-07-25 13:17:31.184394] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f44fc19b8b0 PMD being used: compress_qat 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:50.640 13:17:31 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:50.640 13:17:31 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.581 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:51.581 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:51.581 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:51.581 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.581 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:51.581 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:51.581 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:51.581 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:51.582 
13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:51.582 00:08:51.582 real 0m1.712s 00:08:51.582 user 0m5.784s 00:08:51.582 
sys 0m0.302s 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:51.582 13:17:32 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:51.582 ************************************ 00:08:51.582 END TEST accel_cdev_decomp_mcore 00:08:51.582 ************************************ 00:08:51.582 13:17:32 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:51.582 13:17:32 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:51.582 13:17:32 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:51.582 13:17:32 accel -- common/autotest_common.sh@10 -- # set +x 00:08:51.582 ************************************ 00:08:51.582 START TEST accel_cdev_decomp_full_mcore 00:08:51.582 ************************************ 00:08:51.582 13:17:32 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:51.582 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:08:51.842 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:08:51.842 [2024-07-25 13:17:32.405841] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:51.842 [2024-07-25 13:17:32.405897] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid844640 ] 00:08:51.842 [2024-07-25 13:17:32.494806] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:08:51.842 [2024-07-25 13:17:32.565137] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:51.842 [2024-07-25 13:17:32.565255] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:51.842 [2024-07-25 13:17:32.565396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:51.842 [2024-07-25 13:17:32.565396] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 3 00:08:52.413 [2024-07-25 13:17:32.962733] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:52.413 [2024-07-25 13:17:32.964487] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfd3cc0 PMD being used: compress_qat 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.413 [2024-07-25 13:17:32.968067] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ff30019b8b0 PMD being used: compress_qat 
00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.413 [2024-07-25 13:17:32.969371] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xfd3d60 PMD being used: compress_qat 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.413 [2024-07-25 13:17:32.974870] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ff2f819b8b0 PMD being used: compress_qat 00:08:52.413 [2024-07-25 13:17:32.975044] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7ff2f019b8b0 PMD being used: compress_qat 00:08:52.413 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- 
# read -r var val 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:52.414 13:17:32 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:53.358 13:17:34 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:08:53.358 13:17:34 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:53.358 00:08:53.358 real 0m1.722s 00:08:53.358 user 0m5.836s 00:08:53.358 sys 0m0.286s 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:53.358 13:17:34 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:08:53.358 ************************************ 00:08:53.358 END TEST accel_cdev_decomp_full_mcore 00:08:53.358 ************************************ 00:08:53.358 13:17:34 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:53.358 13:17:34 accel -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:08:53.358 13:17:34 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:53.358 13:17:34 accel -- common/autotest_common.sh@10 -- # set +x 00:08:53.618 ************************************ 00:08:53.618 START TEST accel_cdev_decomp_mthread 00:08:53.618 ************************************ 00:08:53.618 13:17:34 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:53.618 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:53.618 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:53.619 13:17:34 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:53.619 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:08:53.619 [2024-07-25 13:17:34.202511] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:53.619 [2024-07-25 13:17:34.202579] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid844967 ] 00:08:53.619 [2024-07-25 13:17:34.289176] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:53.619 [2024-07-25 13:17:34.357954] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:54.189 [2024-07-25 13:17:34.757510] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:54.189 [2024-07-25 13:17:34.759286] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17a36a0 PMD being used: compress_qat 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 [2024-07-25 13:17:34.762673] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x17a8840 PMD being used: compress_qat 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r 
var val 00:08:54.189 [2024-07-25 13:17:34.764411] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x18cb320 PMD being used: compress_qat 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 
13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:54.189 13:17:34 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- 
# val= 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:55.132 00:08:55.132 real 0m1.697s 00:08:55.132 user 0m1.406s 00:08:55.132 sys 0m0.292s 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:55.132 13:17:35 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:55.132 ************************************ 00:08:55.132 END TEST accel_cdev_decomp_mthread 00:08:55.132 ************************************ 00:08:55.132 13:17:35 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:55.132 13:17:35 accel -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:08:55.132 13:17:35 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:55.132 13:17:35 accel -- common/autotest_common.sh@10 -- # set +x 00:08:55.393 ************************************ 00:08:55.393 START TEST accel_cdev_decomp_full_mthread 00:08:55.393 ************************************ 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1125 -- # accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:08:55.393 13:17:35 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:08:55.393 [2024-07-25 13:17:35.979897] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:08:55.393 [2024-07-25 13:17:35.980011] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid845295 ] 00:08:55.393 [2024-07-25 13:17:36.079751] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:55.393 [2024-07-25 13:17:36.156625] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:55.964 [2024-07-25 13:17:36.550260] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:08:55.964 [2024-07-25 13:17:36.552023] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x180a6a0 PMD being used: compress_qat 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 [2024-07-25 13:17:36.554560] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x180a740 PMD being used: compress_qat 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 
00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 [2024-07-25 13:17:36.556434] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1a0f2d0 PMD being used: compress_qat 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:55.964 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.965 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.965 13:17:36 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:08:55.965 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:55.965 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:55.965 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:55.965 13:17:36 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:56.905 13:17:37 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:08:56.905 00:08:56.905 real 0m1.718s 00:08:56.905 user 0m1.404s 00:08:56.905 sys 0m0.316s 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:56.905 13:17:37 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:08:56.905 ************************************ 00:08:56.905 END TEST accel_cdev_decomp_full_mthread 00:08:56.905 ************************************ 00:08:57.166 13:17:37 accel -- accel/accel.sh@134 -- 
# unset COMPRESSDEV 00:08:57.166 13:17:37 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:57.166 13:17:37 accel -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:08:57.166 13:17:37 accel -- accel/accel.sh@137 -- # build_accel_config 00:08:57.166 13:17:37 accel -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:57.166 13:17:37 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:08:57.166 13:17:37 accel -- common/autotest_common.sh@10 -- # set +x 00:08:57.166 13:17:37 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:08:57.166 13:17:37 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:08:57.166 13:17:37 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:08:57.166 13:17:37 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:08:57.166 13:17:37 accel -- accel/accel.sh@40 -- # local IFS=, 00:08:57.166 13:17:37 accel -- accel/accel.sh@41 -- # jq -r . 00:08:57.166 ************************************ 00:08:57.166 START TEST accel_dif_functional_tests 00:08:57.166 ************************************ 00:08:57.166 13:17:37 accel.accel_dif_functional_tests -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:08:57.166 [2024-07-25 13:17:37.792671] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:57.166 [2024-07-25 13:17:37.792715] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid845620 ] 00:08:57.166 [2024-07-25 13:17:37.882209] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:08:57.166 [2024-07-25 13:17:37.956514] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:08:57.166 [2024-07-25 13:17:37.956651] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:08:57.166 [2024-07-25 13:17:37.956815] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:57.426 00:08:57.426 00:08:57.426 CUnit - A unit testing framework for C - Version 2.1-3 00:08:57.426 http://cunit.sourceforge.net/ 00:08:57.426 00:08:57.426 00:08:57.426 Suite: accel_dif 00:08:57.426 Test: verify: DIF generated, GUARD check ...passed 00:08:57.426 Test: verify: DIF generated, APPTAG check ...passed 00:08:57.426 Test: verify: DIF generated, REFTAG check ...passed 00:08:57.426 Test: verify: DIF not generated, GUARD check ...[2024-07-25 13:17:38.024130] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:57.426 passed 00:08:57.426 Test: verify: DIF not generated, APPTAG check ...[2024-07-25 13:17:38.024184] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:57.426 passed 00:08:57.426 Test: verify: DIF not generated, REFTAG check ...[2024-07-25 13:17:38.024216] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:57.426 passed 00:08:57.426 Test: verify: APPTAG correct, APPTAG check ...passed 00:08:57.426 Test: verify: APPTAG incorrect, APPTAG check ...[2024-07-25 13:17:38.024272] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:08:57.426 passed 
00:08:57.426 Test: verify: APPTAG incorrect, no APPTAG check ...passed 00:08:57.426 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:08:57.426 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:08:57.426 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-07-25 13:17:38.024403] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:08:57.426 passed 00:08:57.426 Test: verify copy: DIF generated, GUARD check ...passed 00:08:57.426 Test: verify copy: DIF generated, APPTAG check ...passed 00:08:57.426 Test: verify copy: DIF generated, REFTAG check ...passed 00:08:57.427 Test: verify copy: DIF not generated, GUARD check ...[2024-07-25 13:17:38.024551] dif.c: 861:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:08:57.427 passed 00:08:57.427 Test: verify copy: DIF not generated, APPTAG check ...[2024-07-25 13:17:38.024581] dif.c: 876:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:08:57.427 passed 00:08:57.427 Test: verify copy: DIF not generated, REFTAG check ...[2024-07-25 13:17:38.024607] dif.c: 811:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:08:57.427 passed 00:08:57.427 Test: generate copy: DIF generated, GUARD check ...passed 00:08:57.427 Test: generate copy: DIF generated, APTTAG check ...passed 00:08:57.427 Test: generate copy: DIF generated, REFTAG check ...passed 00:08:57.427 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:08:57.427 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:08:57.427 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:08:57.427 Test: generate copy: iovecs-len validate ...[2024-07-25 13:17:38.024820] dif.c:1225:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:08:57.427 passed 00:08:57.427 Test: generate copy: buffer alignment validate ...passed 00:08:57.427 00:08:57.427 Run Summary: Type Total Ran Passed Failed Inactive 00:08:57.427 suites 1 1 n/a 0 0 00:08:57.427 tests 26 26 26 0 0 00:08:57.427 asserts 115 115 115 0 n/a 00:08:57.427 00:08:57.427 Elapsed time = 0.002 seconds 00:08:57.427 00:08:57.427 real 0m0.399s 00:08:57.427 user 0m0.519s 00:08:57.427 sys 0m0.154s 00:08:57.427 13:17:38 accel.accel_dif_functional_tests -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:57.427 13:17:38 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:08:57.427 ************************************ 00:08:57.427 END TEST accel_dif_functional_tests 00:08:57.427 ************************************ 00:08:57.427 00:08:57.427 real 0m45.652s 00:08:57.427 user 0m55.149s 00:08:57.427 sys 0m7.913s 00:08:57.427 13:17:38 accel -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:57.427 13:17:38 accel -- common/autotest_common.sh@10 -- # set +x 00:08:57.427 ************************************ 00:08:57.427 END TEST accel 00:08:57.427 ************************************ 00:08:57.427 13:17:38 -- spdk/autotest.sh@186 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:57.427 13:17:38 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:57.427 13:17:38 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:57.427 13:17:38 -- common/autotest_common.sh@10 -- # set +x 00:08:57.687 ************************************ 00:08:57.687 START TEST accel_rpc 00:08:57.687 ************************************ 00:08:57.687 13:17:38 accel_rpc -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:08:57.687 * Looking for test storage... 
00:08:57.687 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:08:57.687 13:17:38 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:08:57.687 13:17:38 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=845685 00:08:57.687 13:17:38 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 845685 00:08:57.687 13:17:38 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:08:57.687 13:17:38 accel_rpc -- common/autotest_common.sh@831 -- # '[' -z 845685 ']' 00:08:57.687 13:17:38 accel_rpc -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:57.687 13:17:38 accel_rpc -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:57.687 13:17:38 accel_rpc -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:57.687 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:57.687 13:17:38 accel_rpc -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:57.687 13:17:38 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:57.687 [2024-07-25 13:17:38.414096] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:57.687 [2024-07-25 13:17:38.414150] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid845685 ] 00:08:57.947 [2024-07-25 13:17:38.502405] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:57.947 [2024-07-25 13:17:38.566036] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:08:58.518 13:17:39 accel_rpc -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:08:58.518 13:17:39 accel_rpc -- common/autotest_common.sh@864 -- # return 0 00:08:58.518 13:17:39 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:08:58.518 13:17:39 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:08:58.518 13:17:39 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:08:58.518 13:17:39 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:08:58.518 13:17:39 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:08:58.518 13:17:39 accel_rpc -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:58.518 13:17:39 accel_rpc -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:58.518 13:17:39 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:58.518 ************************************ 00:08:58.518 START TEST accel_assign_opcode 00:08:58.518 ************************************ 00:08:58.518 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1125 -- # accel_assign_opcode_test_suite 00:08:58.518 13:17:39 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:08:58.518 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:58.519 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:58.519 [2024-07-25 13:17:39.284069] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module incorrect 00:08:58.519 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:58.519 13:17:39 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:08:58.519 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:58.519 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:58.519 [2024-07-25 13:17:39.292083] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:08:58.519 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:58.519 13:17:39 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:08:58.519 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:58.519 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:58.782 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:58.782 13:17:39 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:08:58.782 13:17:39 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:08:58.782 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@561 -- # xtrace_disable 00:08:58.782 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:58.782 13:17:39 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:08:58.782 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:08:58.782 software 00:08:58.782 00:08:58.782 real 0m0.221s 00:08:58.782 user 0m0.051s 00:08:58.782 sys 0m0.009s 00:08:58.782 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1126 -- # 
xtrace_disable 00:08:58.782 13:17:39 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:08:58.782 ************************************ 00:08:58.782 END TEST accel_assign_opcode 00:08:58.782 ************************************ 00:08:58.782 13:17:39 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 845685 00:08:58.782 13:17:39 accel_rpc -- common/autotest_common.sh@950 -- # '[' -z 845685 ']' 00:08:58.782 13:17:39 accel_rpc -- common/autotest_common.sh@954 -- # kill -0 845685 00:08:58.782 13:17:39 accel_rpc -- common/autotest_common.sh@955 -- # uname 00:08:58.782 13:17:39 accel_rpc -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:08:58.782 13:17:39 accel_rpc -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 845685 00:08:59.044 13:17:39 accel_rpc -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:08:59.044 13:17:39 accel_rpc -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:08:59.044 13:17:39 accel_rpc -- common/autotest_common.sh@968 -- # echo 'killing process with pid 845685' 00:08:59.044 killing process with pid 845685 00:08:59.044 13:17:39 accel_rpc -- common/autotest_common.sh@969 -- # kill 845685 00:08:59.044 13:17:39 accel_rpc -- common/autotest_common.sh@974 -- # wait 845685 00:08:59.044 00:08:59.044 real 0m1.533s 00:08:59.044 user 0m1.657s 00:08:59.044 sys 0m0.414s 00:08:59.044 13:17:39 accel_rpc -- common/autotest_common.sh@1126 -- # xtrace_disable 00:08:59.044 13:17:39 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:08:59.044 ************************************ 00:08:59.044 END TEST accel_rpc 00:08:59.044 ************************************ 00:08:59.044 13:17:39 -- spdk/autotest.sh@189 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:59.044 13:17:39 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:08:59.044 13:17:39 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:08:59.044 13:17:39 -- 
common/autotest_common.sh@10 -- # set +x 00:08:59.305 ************************************ 00:08:59.305 START TEST app_cmdline 00:08:59.305 ************************************ 00:08:59.305 13:17:39 app_cmdline -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:08:59.305 * Looking for test storage... 00:08:59.305 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:08:59.305 13:17:39 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:08:59.305 13:17:39 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=846065 00:08:59.305 13:17:39 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 846065 00:08:59.305 13:17:39 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:08:59.305 13:17:39 app_cmdline -- common/autotest_common.sh@831 -- # '[' -z 846065 ']' 00:08:59.305 13:17:39 app_cmdline -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:59.305 13:17:39 app_cmdline -- common/autotest_common.sh@836 -- # local max_retries=100 00:08:59.305 13:17:39 app_cmdline -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:59.305 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:59.305 13:17:39 app_cmdline -- common/autotest_common.sh@840 -- # xtrace_disable 00:08:59.305 13:17:39 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:08:59.305 [2024-07-25 13:17:40.035217] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:08:59.305 [2024-07-25 13:17:40.035279] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid846065 ] 00:08:59.565 [2024-07-25 13:17:40.125905] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:59.565 [2024-07-25 13:17:40.191407] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:00.136 13:17:40 app_cmdline -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:00.136 13:17:40 app_cmdline -- common/autotest_common.sh@864 -- # return 0 00:09:00.136 13:17:40 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:09:00.396 { 00:09:00.396 "version": "SPDK v24.09-pre git sha1 704257090", 00:09:00.396 "fields": { 00:09:00.396 "major": 24, 00:09:00.396 "minor": 9, 00:09:00.396 "patch": 0, 00:09:00.396 "suffix": "-pre", 00:09:00.396 "commit": "704257090" 00:09:00.396 } 00:09:00.396 } 00:09:00.396 13:17:41 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:09:00.396 13:17:41 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:09:00.396 13:17:41 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:09:00.396 13:17:41 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:09:00.396 13:17:41 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:00.396 13:17:41 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:09:00.396 13:17:41 app_cmdline -- app/cmdline.sh@26 -- # sort 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:00.396 13:17:41 app_cmdline -- 
app/cmdline.sh@27 -- # (( 2 == 2 )) 00:09:00.396 13:17:41 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:09:00.396 13:17:41 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@650 -- # local es=0 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:09:00.396 13:17:41 app_cmdline -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:09:00.656 request: 00:09:00.656 { 00:09:00.656 "method": "env_dpdk_get_mem_stats", 00:09:00.656 "req_id": 1 00:09:00.656 } 00:09:00.656 Got JSON-RPC error response 00:09:00.656 response: 00:09:00.656 { 00:09:00.656 
"code": -32601, 00:09:00.656 "message": "Method not found" 00:09:00.656 } 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@653 -- # es=1 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:09:00.656 13:17:41 app_cmdline -- app/cmdline.sh@1 -- # killprocess 846065 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@950 -- # '[' -z 846065 ']' 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@954 -- # kill -0 846065 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@955 -- # uname 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 846065 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@968 -- # echo 'killing process with pid 846065' 00:09:00.656 killing process with pid 846065 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@969 -- # kill 846065 00:09:00.656 13:17:41 app_cmdline -- common/autotest_common.sh@974 -- # wait 846065 00:09:00.917 00:09:00.917 real 0m1.680s 00:09:00.917 user 0m2.092s 00:09:00.917 sys 0m0.421s 00:09:00.917 13:17:41 app_cmdline -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:00.917 13:17:41 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:09:00.917 ************************************ 00:09:00.917 END TEST app_cmdline 00:09:00.917 ************************************ 00:09:00.917 13:17:41 -- spdk/autotest.sh@190 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:00.917 
13:17:41 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:00.917 13:17:41 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:00.917 13:17:41 -- common/autotest_common.sh@10 -- # set +x 00:09:00.917 ************************************ 00:09:00.917 START TEST version 00:09:00.917 ************************************ 00:09:00.917 13:17:41 version -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:09:00.917 * Looking for test storage... 00:09:01.177 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:09:01.177 13:17:41 version -- app/version.sh@17 -- # get_header_version major 00:09:01.177 13:17:41 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:01.177 13:17:41 version -- app/version.sh@14 -- # cut -f2 00:09:01.177 13:17:41 version -- app/version.sh@14 -- # tr -d '"' 00:09:01.177 13:17:41 version -- app/version.sh@17 -- # major=24 00:09:01.177 13:17:41 version -- app/version.sh@18 -- # get_header_version minor 00:09:01.177 13:17:41 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:01.177 13:17:41 version -- app/version.sh@14 -- # cut -f2 00:09:01.177 13:17:41 version -- app/version.sh@14 -- # tr -d '"' 00:09:01.177 13:17:41 version -- app/version.sh@18 -- # minor=9 00:09:01.177 13:17:41 version -- app/version.sh@19 -- # get_header_version patch 00:09:01.177 13:17:41 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:01.177 13:17:41 version -- app/version.sh@14 -- # cut -f2 00:09:01.177 13:17:41 version -- app/version.sh@14 -- # tr -d '"' 00:09:01.177 13:17:41 version -- app/version.sh@19 -- # patch=0 00:09:01.177 13:17:41 version 
-- app/version.sh@20 -- # get_header_version suffix 00:09:01.177 13:17:41 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:09:01.177 13:17:41 version -- app/version.sh@14 -- # cut -f2 00:09:01.177 13:17:41 version -- app/version.sh@14 -- # tr -d '"' 00:09:01.177 13:17:41 version -- app/version.sh@20 -- # suffix=-pre 00:09:01.177 13:17:41 version -- app/version.sh@22 -- # version=24.9 00:09:01.177 13:17:41 version -- app/version.sh@25 -- # (( patch != 0 )) 00:09:01.177 13:17:41 version -- app/version.sh@28 -- # version=24.9rc0 00:09:01.177 13:17:41 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:09:01.177 13:17:41 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:09:01.177 13:17:41 version -- app/version.sh@30 -- # py_version=24.9rc0 00:09:01.177 13:17:41 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:09:01.177 00:09:01.177 real 0m0.184s 00:09:01.177 user 0m0.100s 00:09:01.177 sys 0m0.128s 00:09:01.177 13:17:41 version -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:01.177 13:17:41 version -- common/autotest_common.sh@10 -- # set +x 00:09:01.177 ************************************ 00:09:01.177 END TEST version 00:09:01.177 ************************************ 00:09:01.177 13:17:41 -- spdk/autotest.sh@192 -- # '[' 1 -eq 1 ']' 00:09:01.177 13:17:41 -- spdk/autotest.sh@193 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:01.177 13:17:41 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:09:01.177 13:17:41 -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:09:01.177 13:17:41 -- common/autotest_common.sh@10 -- # set +x 00:09:01.177 ************************************ 00:09:01.177 START TEST blockdev_general 00:09:01.177 ************************************ 00:09:01.177 13:17:41 blockdev_general -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:09:01.177 * Looking for test storage... 00:09:01.438 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:01.438 13:17:41 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@673 -- # uname -s 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:09:01.438 
13:17:41 blockdev_general -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@681 -- # test_type=bdev 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@682 -- # crypto_device= 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@683 -- # dek= 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@684 -- # env_ctx= 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@689 -- # [[ bdev == bdev ]] 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=846491 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 846491 00:09:01.438 13:17:41 blockdev_general -- common/autotest_common.sh@831 -- # '[' -z 846491 ']' 00:09:01.438 13:17:41 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:09:01.438 13:17:41 blockdev_general -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:01.438 13:17:41 blockdev_general -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:01.438 13:17:41 blockdev_general -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:01.438 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:09:01.438 13:17:41 blockdev_general -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:01.438 13:17:41 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:01.438 [2024-07-25 13:17:42.055087] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:09:01.438 [2024-07-25 13:17:42.055146] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid846491 ] 00:09:01.438 [2024-07-25 13:17:42.148186] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:01.438 [2024-07-25 13:17:42.216446] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:02.378 13:17:42 blockdev_general -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:02.378 13:17:42 blockdev_general -- common/autotest_common.sh@864 -- # return 0 00:09:02.378 13:17:42 blockdev_general -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:09:02.378 13:17:42 blockdev_general -- bdev/blockdev.sh@695 -- # setup_bdev_conf 00:09:02.378 13:17:42 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:09:02.378 13:17:42 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.378 13:17:42 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.378 [2024-07-25 13:17:43.067402] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:02.378 [2024-07-25 13:17:43.067446] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:02.378 00:09:02.378 [2024-07-25 13:17:43.075397] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:02.378 [2024-07-25 13:17:43.075415] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:02.378 00:09:02.378 Malloc0 00:09:02.378 Malloc1 00:09:02.378 
Malloc2 00:09:02.378 Malloc3 00:09:02.378 Malloc4 00:09:02.378 Malloc5 00:09:02.378 Malloc6 00:09:02.378 Malloc7 00:09:02.639 Malloc8 00:09:02.639 Malloc9 00:09:02.639 [2024-07-25 13:17:43.184412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:02.639 [2024-07-25 13:17:43.184447] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:02.639 [2024-07-25 13:17:43.184459] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fc1dd0 00:09:02.639 [2024-07-25 13:17:43.184465] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:02.639 [2024-07-25 13:17:43.185592] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:02.640 [2024-07-25 13:17:43.185612] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:02.640 TestPT 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.640 13:17:43 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:09:02.640 5000+0 records in 00:09:02.640 5000+0 records out 00:09:02.640 10240000 bytes (10 MB, 9.8 MiB) copied, 0.00673243 s, 1.5 GB/s 00:09:02.640 13:17:43 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.640 AIO0 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.640 13:17:43 blockdev_general -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.640 13:17:43 blockdev_general -- 
common/autotest_common.sh@10 -- # set +x 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.640 13:17:43 blockdev_general -- bdev/blockdev.sh@739 -- # cat 00:09:02.640 13:17:43 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.640 13:17:43 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.640 13:17:43 blockdev_general -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.640 13:17:43 blockdev_general -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:09:02.640 13:17:43 blockdev_general -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:09:02.640 13:17:43 blockdev_general -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@561 -- # xtrace_disable 00:09:02.640 13:17:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:09:02.900 13:17:43 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:09:02.900 13:17:43 blockdev_general -- 
bdev/blockdev.sh@748 -- # jq -r .name 00:09:02.900 13:17:43 blockdev_general -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "662766cf-a329-4b13-8bbf-beb7530fad19"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "662766cf-a329-4b13-8bbf-beb7530fad19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "8499c77c-0f29-5e8a-8896-2004e873e147"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "8499c77c-0f29-5e8a-8896-2004e873e147",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' 
"seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "7a3a2d7f-485d-5e9f-893e-9b0f4d59dc62"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7a3a2d7f-485d-5e9f-893e-9b0f4d59dc62",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "f37f913b-e042-5a4b-8baf-39ef74035b81"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f37f913b-e042-5a4b-8baf-39ef74035b81",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' 
"nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "e8b18e7a-7da5-5fa0-8b5f-ab69b95611fd"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e8b18e7a-7da5-5fa0-8b5f-ab69b95611fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "d7540043-92ff-5f7a-998d-91853a2b9c68"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d7540043-92ff-5f7a-998d-91853a2b9c68",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' 
"split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "11162537-5c21-5fc0-95a0-e3b5fc1714c9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "11162537-5c21-5fc0-95a0-e3b5fc1714c9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "b7cfacdd-932e-5978-8bfd-0b0b57b936b1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b7cfacdd-932e-5978-8bfd-0b0b57b936b1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' 
"offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "7cfbd5e3-03d8-55ab-a794-7e265fe1ec0e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7cfbd5e3-03d8-55ab-a794-7e265fe1ec0e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "41c854c2-376d-5540-938f-6c103f9acba1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "41c854c2-376d-5540-938f-6c103f9acba1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' 
"name": "Malloc2p7",' ' "aliases": [' ' "b70a9f28-ebd6-565c-a639-8069043f8871"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b70a9f28-ebd6-565c-a639-8069043f8871",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "f65e4131-bbc3-5b92-92e2-60bbb5306484"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f65e4131-bbc3-5b92-92e2-60bbb5306484",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' 
}' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "9ff378bc-36fb-48cf-83cf-be3bb4f804cb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "9ff378bc-36fb-48cf-83cf-be3bb4f804cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "9ff378bc-36fb-48cf-83cf-be3bb4f804cb",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "5c832d3d-b2d2-4dd8-b3d8-3fadcabe7ea4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "5e0865da-3075-428a-9538-df4fb2ece21d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' 
"d2995536-33da-47ba-ae0d-bbe563b5ee92"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "d2995536-33da-47ba-ae0d-bbe563b5ee92",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d2995536-33da-47ba-ae0d-bbe563b5ee92",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "7e5009b0-a0a0-4e3c-b6e7-fdf6341a30c4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "06d99b35-06ec-4c4c-acc5-2eb5d368ef4a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "211d6cb5-8420-46e9-8e58-7cbd2cb6ceea"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "211d6cb5-8420-46e9-8e58-7cbd2cb6ceea",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "211d6cb5-8420-46e9-8e58-7cbd2cb6ceea",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "893433b6-0433-4c8b-aa5e-c9ef348563b6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "9f1270e8-d25a-42d6-8b71-baa792c5685b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "b6ec2202-c66a-4eb8-92c9-20ef283bbe95"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "b6ec2202-c66a-4eb8-92c9-20ef283bbe95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:02.900 13:17:43 blockdev_general -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:09:02.900 13:17:43 blockdev_general -- bdev/blockdev.sh@751 -- # hello_world_bdev=Malloc0 00:09:02.900 13:17:43 blockdev_general -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:09:02.900 13:17:43 blockdev_general -- bdev/blockdev.sh@753 -- # killprocess 846491 00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@950 -- # '[' -z 846491 ']' 00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@954 -- # kill -0 846491 00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@955 -- # uname 00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 846491 00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@968 -- # echo 'killing process with pid 846491' 00:09:02.900 killing process with pid 846491 00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@969 -- # kill 846491 
00:09:02.900 13:17:43 blockdev_general -- common/autotest_common.sh@974 -- # wait 846491 00:09:03.160 13:17:43 blockdev_general -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:03.160 13:17:43 blockdev_general -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:03.160 13:17:43 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:09:03.160 13:17:43 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:03.160 13:17:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:03.160 ************************************ 00:09:03.160 START TEST bdev_hello_world 00:09:03.160 ************************************ 00:09:03.160 13:17:43 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:09:03.160 [2024-07-25 13:17:43.926480] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:09:03.160 [2024-07-25 13:17:43.926527] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid846821 ] 00:09:03.420 [2024-07-25 13:17:44.012475] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:03.420 [2024-07-25 13:17:44.076323] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:03.420 [2024-07-25 13:17:44.197715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:03.420 [2024-07-25 13:17:44.197758] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:03.420 [2024-07-25 13:17:44.197767] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:03.420 [2024-07-25 13:17:44.205720] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:03.420 [2024-07-25 13:17:44.205739] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:03.681 [2024-07-25 13:17:44.213733] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:03.681 [2024-07-25 13:17:44.213750] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:03.681 [2024-07-25 13:17:44.274774] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:03.681 [2024-07-25 13:17:44.274812] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:03.681 [2024-07-25 13:17:44.274822] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1817c70 00:09:03.681 [2024-07-25 13:17:44.274833] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:03.681 [2024-07-25 13:17:44.275972] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
pt_bdev registered 00:09:03.681 [2024-07-25 13:17:44.275991] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:03.681 [2024-07-25 13:17:44.406911] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:09:03.681 [2024-07-25 13:17:44.406955] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:09:03.681 [2024-07-25 13:17:44.406985] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:09:03.681 [2024-07-25 13:17:44.407024] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:09:03.681 [2024-07-25 13:17:44.407073] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:09:03.681 [2024-07-25 13:17:44.407087] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:09:03.681 [2024-07-25 13:17:44.407123] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:09:03.681 00:09:03.681 [2024-07-25 13:17:44.407143] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:09:03.941 00:09:03.941 real 0m0.706s 00:09:03.941 user 0m0.461s 00:09:03.941 sys 0m0.201s 00:09:03.941 13:17:44 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:03.941 13:17:44 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:09:03.941 ************************************ 00:09:03.941 END TEST bdev_hello_world 00:09:03.941 ************************************ 00:09:03.941 13:17:44 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:09:03.941 13:17:44 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:03.941 13:17:44 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:03.941 13:17:44 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:03.941 ************************************ 00:09:03.941 START TEST bdev_bounds 00:09:03.941 ************************************ 00:09:03.941 13:17:44 
blockdev_general.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=847039 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 847039' 00:09:03.941 Process bdevio pid: 847039 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 847039 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 847039 ']' 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:03.941 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:03.941 13:17:44 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:03.941 [2024-07-25 13:17:44.712803] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:09:03.942 [2024-07-25 13:17:44.712853] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid847039 ] 00:09:04.202 [2024-07-25 13:17:44.804921] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:09:04.202 [2024-07-25 13:17:44.884297] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:04.202 [2024-07-25 13:17:44.884443] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:04.202 [2024-07-25 13:17:44.884443] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:09:04.462 [2024-07-25 13:17:45.008287] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:04.462 [2024-07-25 13:17:45.008325] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:04.462 [2024-07-25 13:17:45.008334] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:04.462 [2024-07-25 13:17:45.016293] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:04.462 [2024-07-25 13:17:45.016312] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:04.462 [2024-07-25 13:17:45.024311] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:04.462 [2024-07-25 13:17:45.024327] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:04.462 [2024-07-25 13:17:45.085381] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:04.462 [2024-07-25 13:17:45.085420] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:04.462 [2024-07-25 13:17:45.085430] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1963f00 
00:09:04.462 [2024-07-25 13:17:45.085437] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:04.462 [2024-07-25 13:17:45.086647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:04.462 [2024-07-25 13:17:45.086668] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:05.033 13:17:45 blockdev_general.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:05.033 13:17:45 blockdev_general.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:09:05.033 13:17:45 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:09:05.033 I/O targets: 00:09:05.033 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:09:05.033 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:09:05.033 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:09:05.033 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:09:05.033 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:09:05.033 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:09:05.033 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:09:05.033 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:09:05.033 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:09:05.033 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:09:05.033 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:09:05.033 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:09:05.033 raid0: 131072 blocks of 512 bytes (64 MiB) 00:09:05.033 concat0: 131072 blocks of 512 bytes (64 MiB) 00:09:05.033 raid1: 65536 blocks of 512 bytes (32 MiB) 00:09:05.033 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:09:05.033 00:09:05.033 00:09:05.033 CUnit - A unit testing framework for C - Version 2.1-3 00:09:05.033 http://cunit.sourceforge.net/ 00:09:05.033 00:09:05.033 00:09:05.033 Suite: bdevio tests on: AIO0 00:09:05.033 Test: blockdev write read block ...passed 00:09:05.033 Test: blockdev write zeroes read block ...passed 00:09:05.033 
Test: blockdev write zeroes read no split ...passed 00:09:05.033 Test: blockdev write zeroes read split ...passed 00:09:05.033 Test: blockdev write zeroes read split partial ...passed 00:09:05.033 Test: blockdev reset ...passed 00:09:05.033 Test: blockdev write read 8 blocks ...passed 00:09:05.033 Test: blockdev write read size > 128k ...passed 00:09:05.033 Test: blockdev write read invalid size ...passed 00:09:05.033 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.033 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.033 Test: blockdev write read max offset ...passed 00:09:05.033 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.033 Test: blockdev writev readv 8 blocks ...passed 00:09:05.033 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.033 Test: blockdev writev readv block ...passed 00:09:05.033 Test: blockdev writev readv size > 128k ...passed 00:09:05.033 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.033 Test: blockdev comparev and writev ...passed 00:09:05.033 Test: blockdev nvme passthru rw ...passed 00:09:05.033 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.033 Test: blockdev nvme admin passthru ...passed 00:09:05.033 Test: blockdev copy ...passed 00:09:05.033 Suite: bdevio tests on: raid1 00:09:05.033 Test: blockdev write read block ...passed 00:09:05.033 Test: blockdev write zeroes read block ...passed 00:09:05.033 Test: blockdev write zeroes read no split ...passed 00:09:05.033 Test: blockdev write zeroes read split ...passed 00:09:05.033 Test: blockdev write zeroes read split partial ...passed 00:09:05.033 Test: blockdev reset ...passed 00:09:05.033 Test: blockdev write read 8 blocks ...passed 00:09:05.033 Test: blockdev write read size > 128k ...passed 00:09:05.033 Test: blockdev write read invalid size ...passed 00:09:05.033 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:09:05.033 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.033 Test: blockdev write read max offset ...passed 00:09:05.033 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.033 Test: blockdev writev readv 8 blocks ...passed 00:09:05.033 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.033 Test: blockdev writev readv block ...passed 00:09:05.033 Test: blockdev writev readv size > 128k ...passed 00:09:05.033 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.034 Test: blockdev comparev and writev ...passed 00:09:05.034 Test: blockdev nvme passthru rw ...passed 00:09:05.034 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.034 Test: blockdev nvme admin passthru ...passed 00:09:05.034 Test: blockdev copy ...passed 00:09:05.034 Suite: bdevio tests on: concat0 00:09:05.034 Test: blockdev write read block ...passed 00:09:05.034 Test: blockdev write zeroes read block ...passed 00:09:05.034 Test: blockdev write zeroes read no split ...passed 00:09:05.034 Test: blockdev write zeroes read split ...passed 00:09:05.034 Test: blockdev write zeroes read split partial ...passed 00:09:05.034 Test: blockdev reset ...passed 00:09:05.034 Test: blockdev write read 8 blocks ...passed 00:09:05.034 Test: blockdev write read size > 128k ...passed 00:09:05.034 Test: blockdev write read invalid size ...passed 00:09:05.034 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.034 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.034 Test: blockdev write read max offset ...passed 00:09:05.034 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.034 Test: blockdev writev readv 8 blocks ...passed 00:09:05.034 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.034 Test: blockdev writev readv block ...passed 00:09:05.034 Test: blockdev writev readv size > 128k ...passed 00:09:05.034 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:09:05.034 Test: blockdev comparev and writev ...passed 00:09:05.034 Test: blockdev nvme passthru rw ...passed 00:09:05.034 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.034 Test: blockdev nvme admin passthru ...passed 00:09:05.034 Test: blockdev copy ...passed 00:09:05.034 Suite: bdevio tests on: raid0 00:09:05.034 Test: blockdev write read block ...passed 00:09:05.034 Test: blockdev write zeroes read block ...passed 00:09:05.034 Test: blockdev write zeroes read no split ...passed 00:09:05.034 Test: blockdev write zeroes read split ...passed 00:09:05.034 Test: blockdev write zeroes read split partial ...passed 00:09:05.034 Test: blockdev reset ...passed 00:09:05.034 Test: blockdev write read 8 blocks ...passed 00:09:05.034 Test: blockdev write read size > 128k ...passed 00:09:05.034 Test: blockdev write read invalid size ...passed 00:09:05.034 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.034 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.034 Test: blockdev write read max offset ...passed 00:09:05.034 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.034 Test: blockdev writev readv 8 blocks ...passed 00:09:05.034 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.034 Test: blockdev writev readv block ...passed 00:09:05.034 Test: blockdev writev readv size > 128k ...passed 00:09:05.034 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.034 Test: blockdev comparev and writev ...passed 00:09:05.034 Test: blockdev nvme passthru rw ...passed 00:09:05.034 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.034 Test: blockdev nvme admin passthru ...passed 00:09:05.034 Test: blockdev copy ...passed 00:09:05.034 Suite: bdevio tests on: TestPT 00:09:05.034 Test: blockdev write read block ...passed 00:09:05.034 Test: blockdev write zeroes read block ...passed 
00:09:05.034 Test: blockdev write zeroes read no split ...passed 00:09:05.034 Test: blockdev write zeroes read split ...passed 00:09:05.034 Test: blockdev write zeroes read split partial ...passed 00:09:05.034 Test: blockdev reset ...passed 00:09:05.034 Test: blockdev write read 8 blocks ...passed 00:09:05.034 Test: blockdev write read size > 128k ...passed 00:09:05.034 Test: blockdev write read invalid size ...passed 00:09:05.034 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.034 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.034 Test: blockdev write read max offset ...passed 00:09:05.034 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.034 Test: blockdev writev readv 8 blocks ...passed 00:09:05.034 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.034 Test: blockdev writev readv block ...passed 00:09:05.034 Test: blockdev writev readv size > 128k ...passed 00:09:05.034 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.034 Test: blockdev comparev and writev ...passed 00:09:05.034 Test: blockdev nvme passthru rw ...passed 00:09:05.034 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.034 Test: blockdev nvme admin passthru ...passed 00:09:05.034 Test: blockdev copy ...passed 00:09:05.034 Suite: bdevio tests on: Malloc2p7 00:09:05.034 Test: blockdev write read block ...passed 00:09:05.034 Test: blockdev write zeroes read block ...passed 00:09:05.034 Test: blockdev write zeroes read no split ...passed 00:09:05.034 Test: blockdev write zeroes read split ...passed 00:09:05.034 Test: blockdev write zeroes read split partial ...passed 00:09:05.034 Test: blockdev reset ...passed 00:09:05.034 Test: blockdev write read 8 blocks ...passed 00:09:05.034 Test: blockdev write read size > 128k ...passed 00:09:05.034 Test: blockdev write read invalid size ...passed 00:09:05.034 Test: blockdev write read offset + nbytes == size of blockdev 
...passed 00:09:05.034 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.034 Test: blockdev write read max offset ...passed 00:09:05.034 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.034 Test: blockdev writev readv 8 blocks ...passed 00:09:05.034 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.034 Test: blockdev writev readv block ...passed 00:09:05.034 Test: blockdev writev readv size > 128k ...passed 00:09:05.034 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.034 Test: blockdev comparev and writev ...passed 00:09:05.034 Test: blockdev nvme passthru rw ...passed 00:09:05.034 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.034 Test: blockdev nvme admin passthru ...passed 00:09:05.034 Test: blockdev copy ...passed 00:09:05.034 Suite: bdevio tests on: Malloc2p6 00:09:05.034 Test: blockdev write read block ...passed 00:09:05.034 Test: blockdev write zeroes read block ...passed 00:09:05.034 Test: blockdev write zeroes read no split ...passed 00:09:05.034 Test: blockdev write zeroes read split ...passed 00:09:05.034 Test: blockdev write zeroes read split partial ...passed 00:09:05.034 Test: blockdev reset ...passed 00:09:05.034 Test: blockdev write read 8 blocks ...passed 00:09:05.034 Test: blockdev write read size > 128k ...passed 00:09:05.034 Test: blockdev write read invalid size ...passed 00:09:05.034 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.034 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.034 Test: blockdev write read max offset ...passed 00:09:05.034 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.034 Test: blockdev writev readv 8 blocks ...passed 00:09:05.034 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.034 Test: blockdev writev readv block ...passed 00:09:05.034 Test: blockdev writev readv size > 128k ...passed 00:09:05.034 
Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.034 Test: blockdev comparev and writev ...passed 00:09:05.034 Test: blockdev nvme passthru rw ...passed 00:09:05.034 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.034 Test: blockdev nvme admin passthru ...passed 00:09:05.034 Test: blockdev copy ...passed 00:09:05.034 Suite: bdevio tests on: Malloc2p5 00:09:05.034 Test: blockdev write read block ...passed 00:09:05.034 Test: blockdev write zeroes read block ...passed 00:09:05.034 Test: blockdev write zeroes read no split ...passed 00:09:05.034 Test: blockdev write zeroes read split ...passed 00:09:05.034 Test: blockdev write zeroes read split partial ...passed 00:09:05.034 Test: blockdev reset ...passed 00:09:05.034 Test: blockdev write read 8 blocks ...passed 00:09:05.034 Test: blockdev write read size > 128k ...passed 00:09:05.034 Test: blockdev write read invalid size ...passed 00:09:05.034 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.034 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.034 Test: blockdev write read max offset ...passed 00:09:05.034 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.034 Test: blockdev writev readv 8 blocks ...passed 00:09:05.034 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.034 Test: blockdev writev readv block ...passed 00:09:05.034 Test: blockdev writev readv size > 128k ...passed 00:09:05.034 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.295 Test: blockdev comparev and writev ...passed 00:09:05.295 Test: blockdev nvme passthru rw ...passed 00:09:05.295 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.295 Test: blockdev nvme admin passthru ...passed 00:09:05.295 Test: blockdev copy ...passed 00:09:05.295 Suite: bdevio tests on: Malloc2p4 00:09:05.295 Test: blockdev write read block ...passed 00:09:05.295 Test: blockdev write zeroes read block 
...passed 00:09:05.295 Test: blockdev write zeroes read no split ...passed 00:09:05.295 Test: blockdev write zeroes read split ...passed 00:09:05.295 Test: blockdev write zeroes read split partial ...passed 00:09:05.295 Test: blockdev reset ...passed 00:09:05.295 Test: blockdev write read 8 blocks ...passed 00:09:05.295 Test: blockdev write read size > 128k ...passed 00:09:05.295 Test: blockdev write read invalid size ...passed 00:09:05.295 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.295 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.295 Test: blockdev write read max offset ...passed 00:09:05.295 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.295 Test: blockdev writev readv 8 blocks ...passed 00:09:05.295 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.295 Test: blockdev writev readv block ...passed 00:09:05.295 Test: blockdev writev readv size > 128k ...passed 00:09:05.295 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.295 Test: blockdev comparev and writev ...passed 00:09:05.295 Test: blockdev nvme passthru rw ...passed 00:09:05.295 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.295 Test: blockdev nvme admin passthru ...passed 00:09:05.295 Test: blockdev copy ...passed 00:09:05.295 Suite: bdevio tests on: Malloc2p3 00:09:05.295 Test: blockdev write read block ...passed 00:09:05.295 Test: blockdev write zeroes read block ...passed 00:09:05.295 Test: blockdev write zeroes read no split ...passed 00:09:05.295 Test: blockdev write zeroes read split ...passed 00:09:05.295 Test: blockdev write zeroes read split partial ...passed 00:09:05.295 Test: blockdev reset ...passed 00:09:05.295 Test: blockdev write read 8 blocks ...passed 00:09:05.295 Test: blockdev write read size > 128k ...passed 00:09:05.295 Test: blockdev write read invalid size ...passed 00:09:05.295 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:09:05.295 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.295 Test: blockdev write read max offset ...passed 00:09:05.295 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.295 Test: blockdev writev readv 8 blocks ...passed 00:09:05.295 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.295 Test: blockdev writev readv block ...passed 00:09:05.295 Test: blockdev writev readv size > 128k ...passed 00:09:05.295 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.295 Test: blockdev comparev and writev ...passed 00:09:05.295 Test: blockdev nvme passthru rw ...passed 00:09:05.295 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.295 Test: blockdev nvme admin passthru ...passed 00:09:05.295 Test: blockdev copy ...passed 00:09:05.295 Suite: bdevio tests on: Malloc2p2 00:09:05.295 Test: blockdev write read block ...passed 00:09:05.295 Test: blockdev write zeroes read block ...passed 00:09:05.295 Test: blockdev write zeroes read no split ...passed 00:09:05.295 Test: blockdev write zeroes read split ...passed 00:09:05.295 Test: blockdev write zeroes read split partial ...passed 00:09:05.295 Test: blockdev reset ...passed 00:09:05.295 Test: blockdev write read 8 blocks ...passed 00:09:05.295 Test: blockdev write read size > 128k ...passed 00:09:05.295 Test: blockdev write read invalid size ...passed 00:09:05.295 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.295 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.295 Test: blockdev write read max offset ...passed 00:09:05.295 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.295 Test: blockdev writev readv 8 blocks ...passed 00:09:05.295 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.295 Test: blockdev writev readv block ...passed 00:09:05.295 Test: blockdev writev readv size > 128k ...passed 
00:09:05.295 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.295 Test: blockdev comparev and writev ...passed 00:09:05.295 Test: blockdev nvme passthru rw ...passed 00:09:05.295 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.295 Test: blockdev nvme admin passthru ...passed 00:09:05.295 Test: blockdev copy ...passed 00:09:05.295 Suite: bdevio tests on: Malloc2p1 00:09:05.295 Test: blockdev write read block ...passed 00:09:05.295 Test: blockdev write zeroes read block ...passed 00:09:05.295 Test: blockdev write zeroes read no split ...passed 00:09:05.295 Test: blockdev write zeroes read split ...passed 00:09:05.295 Test: blockdev write zeroes read split partial ...passed 00:09:05.295 Test: blockdev reset ...passed 00:09:05.295 Test: blockdev write read 8 blocks ...passed 00:09:05.295 Test: blockdev write read size > 128k ...passed 00:09:05.296 Test: blockdev write read invalid size ...passed 00:09:05.296 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.296 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.296 Test: blockdev write read max offset ...passed 00:09:05.296 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.296 Test: blockdev writev readv 8 blocks ...passed 00:09:05.296 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.296 Test: blockdev writev readv block ...passed 00:09:05.296 Test: blockdev writev readv size > 128k ...passed 00:09:05.296 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.296 Test: blockdev comparev and writev ...passed 00:09:05.296 Test: blockdev nvme passthru rw ...passed 00:09:05.296 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.296 Test: blockdev nvme admin passthru ...passed 00:09:05.296 Test: blockdev copy ...passed 00:09:05.296 Suite: bdevio tests on: Malloc2p0 00:09:05.296 Test: blockdev write read block ...passed 00:09:05.296 Test: blockdev write 
zeroes read block ...passed 00:09:05.296 Test: blockdev write zeroes read no split ...passed 00:09:05.296 Test: blockdev write zeroes read split ...passed 00:09:05.296 Test: blockdev write zeroes read split partial ...passed 00:09:05.296 Test: blockdev reset ...passed 00:09:05.296 Test: blockdev write read 8 blocks ...passed 00:09:05.296 Test: blockdev write read size > 128k ...passed 00:09:05.296 Test: blockdev write read invalid size ...passed 00:09:05.296 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.296 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.296 Test: blockdev write read max offset ...passed 00:09:05.296 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.296 Test: blockdev writev readv 8 blocks ...passed 00:09:05.296 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.296 Test: blockdev writev readv block ...passed 00:09:05.296 Test: blockdev writev readv size > 128k ...passed 00:09:05.296 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.296 Test: blockdev comparev and writev ...passed 00:09:05.296 Test: blockdev nvme passthru rw ...passed 00:09:05.296 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.296 Test: blockdev nvme admin passthru ...passed 00:09:05.296 Test: blockdev copy ...passed 00:09:05.296 Suite: bdevio tests on: Malloc1p1 00:09:05.296 Test: blockdev write read block ...passed 00:09:05.296 Test: blockdev write zeroes read block ...passed 00:09:05.296 Test: blockdev write zeroes read no split ...passed 00:09:05.296 Test: blockdev write zeroes read split ...passed 00:09:05.296 Test: blockdev write zeroes read split partial ...passed 00:09:05.296 Test: blockdev reset ...passed 00:09:05.296 Test: blockdev write read 8 blocks ...passed 00:09:05.296 Test: blockdev write read size > 128k ...passed 00:09:05.296 Test: blockdev write read invalid size ...passed 00:09:05.296 Test: blockdev write read offset + 
nbytes == size of blockdev ...passed 00:09:05.296 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.296 Test: blockdev write read max offset ...passed 00:09:05.296 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.296 Test: blockdev writev readv 8 blocks ...passed 00:09:05.296 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.296 Test: blockdev writev readv block ...passed 00:09:05.296 Test: blockdev writev readv size > 128k ...passed 00:09:05.296 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.296 Test: blockdev comparev and writev ...passed 00:09:05.296 Test: blockdev nvme passthru rw ...passed 00:09:05.296 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.296 Test: blockdev nvme admin passthru ...passed 00:09:05.296 Test: blockdev copy ...passed 00:09:05.296 Suite: bdevio tests on: Malloc1p0 00:09:05.296 Test: blockdev write read block ...passed 00:09:05.296 Test: blockdev write zeroes read block ...passed 00:09:05.296 Test: blockdev write zeroes read no split ...passed 00:09:05.296 Test: blockdev write zeroes read split ...passed 00:09:05.296 Test: blockdev write zeroes read split partial ...passed 00:09:05.296 Test: blockdev reset ...passed 00:09:05.296 Test: blockdev write read 8 blocks ...passed 00:09:05.296 Test: blockdev write read size > 128k ...passed 00:09:05.296 Test: blockdev write read invalid size ...passed 00:09:05.296 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.296 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.296 Test: blockdev write read max offset ...passed 00:09:05.296 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.296 Test: blockdev writev readv 8 blocks ...passed 00:09:05.296 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.296 Test: blockdev writev readv block ...passed 00:09:05.296 Test: blockdev writev readv size > 
128k ...passed 00:09:05.296 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.296 Test: blockdev comparev and writev ...passed 00:09:05.296 Test: blockdev nvme passthru rw ...passed 00:09:05.296 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.296 Test: blockdev nvme admin passthru ...passed 00:09:05.296 Test: blockdev copy ...passed 00:09:05.296 Suite: bdevio tests on: Malloc0 00:09:05.296 Test: blockdev write read block ...passed 00:09:05.296 Test: blockdev write zeroes read block ...passed 00:09:05.296 Test: blockdev write zeroes read no split ...passed 00:09:05.296 Test: blockdev write zeroes read split ...passed 00:09:05.296 Test: blockdev write zeroes read split partial ...passed 00:09:05.296 Test: blockdev reset ...passed 00:09:05.296 Test: blockdev write read 8 blocks ...passed 00:09:05.296 Test: blockdev write read size > 128k ...passed 00:09:05.296 Test: blockdev write read invalid size ...passed 00:09:05.296 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:09:05.296 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:09:05.296 Test: blockdev write read max offset ...passed 00:09:05.296 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:09:05.296 Test: blockdev writev readv 8 blocks ...passed 00:09:05.296 Test: blockdev writev readv 30 x 1block ...passed 00:09:05.296 Test: blockdev writev readv block ...passed 00:09:05.296 Test: blockdev writev readv size > 128k ...passed 00:09:05.296 Test: blockdev writev readv size > 128k in two iovs ...passed 00:09:05.296 Test: blockdev comparev and writev ...passed 00:09:05.296 Test: blockdev nvme passthru rw ...passed 00:09:05.296 Test: blockdev nvme passthru vendor specific ...passed 00:09:05.296 Test: blockdev nvme admin passthru ...passed 00:09:05.296 Test: blockdev copy ...passed 00:09:05.296 00:09:05.296 Run Summary: Type Total Ran Passed Failed Inactive 00:09:05.296 suites 16 16 n/a 0 0 00:09:05.296 
tests 368 368 368 0 0 00:09:05.296 asserts 2224 2224 2224 0 n/a 00:09:05.296 00:09:05.296 Elapsed time = 0.671 seconds 00:09:05.296 0 00:09:05.296 13:17:45 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 847039 00:09:05.296 13:17:45 blockdev_general.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 847039 ']' 00:09:05.296 13:17:45 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 847039 00:09:05.296 13:17:45 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:09:05.296 13:17:45 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:05.296 13:17:45 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 847039 00:09:05.296 13:17:46 blockdev_general.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:05.296 13:17:46 blockdev_general.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:05.296 13:17:46 blockdev_general.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 847039' 00:09:05.296 killing process with pid 847039 00:09:05.296 13:17:46 blockdev_general.bdev_bounds -- common/autotest_common.sh@969 -- # kill 847039 00:09:05.296 13:17:46 blockdev_general.bdev_bounds -- common/autotest_common.sh@974 -- # wait 847039 00:09:05.557 13:17:46 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:09:05.557 00:09:05.557 real 0m1.532s 00:09:05.557 user 0m3.860s 00:09:05.557 sys 0m0.365s 00:09:05.557 13:17:46 blockdev_general.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:05.557 13:17:46 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:09:05.557 ************************************ 00:09:05.557 END TEST bdev_bounds 00:09:05.557 ************************************ 00:09:05.557 13:17:46 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_nbd 
nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:05.557 13:17:46 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:09:05.557 13:17:46 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:05.557 13:17:46 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:05.557 ************************************ 00:09:05.557 START TEST bdev_nbd 00:09:05.557 ************************************ 00:09:05.557 13:17:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:09:05.557 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:09:05.557 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:09:05.557 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=16 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:09:05.558 13:17:46 
blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=16 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=847280 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 847280 /var/tmp/spdk-nbd.sock 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 847280 ']' 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@835 -- # local 
rpc_addr=/var/tmp/spdk-nbd.sock 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:09:05.558 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:09:05.558 13:17:46 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:05.558 [2024-07-25 13:17:46.330873] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:09:05.558 [2024-07-25 13:17:46.330927] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:05.818 [2024-07-25 13:17:46.426131] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:05.818 [2024-07-25 13:17:46.502846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:06.078 [2024-07-25 13:17:46.634055] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:06.078 [2024-07-25 13:17:46.634096] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:06.078 [2024-07-25 13:17:46.634104] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:06.078 [2024-07-25 13:17:46.642064] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:06.078 [2024-07-25 13:17:46.642084] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:06.078 [2024-07-25 13:17:46.650076] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: Malloc2 00:09:06.078 [2024-07-25 13:17:46.650093] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:06.078 [2024-07-25 13:17:46.711033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:06.078 [2024-07-25 13:17:46.711068] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:06.078 [2024-07-25 13:17:46.711077] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e7ca60 00:09:06.078 [2024-07-25 13:17:46.711084] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:06.078 [2024-07-25 13:17:46.712260] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:06.078 [2024-07-25 13:17:46.712279] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 
Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- 
# break 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:06.648 1+0 records in 00:09:06.648 1+0 records out 00:09:06.648 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00020879 s, 19.6 MB/s 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:06.648 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@869 -- # local i 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:06.908 1+0 records in 00:09:06.908 1+0 records out 00:09:06.908 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242898 s, 16.9 MB/s 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:06.908 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:09:07.168 13:17:47 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:07.168 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.168 1+0 records in 00:09:07.168 1+0 records out 00:09:07.169 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000308739 s, 13.3 MB/s 00:09:07.169 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.169 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:07.169 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.169 13:17:47 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:07.169 13:17:47 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@889 -- # return 0 00:09:07.169 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.169 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.169 13:17:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.430 1+0 records in 00:09:07.430 1+0 records out 00:09:07.430 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00032662 s, 12.5 MB/s 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.430 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:09:07.690 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:09:07.690 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:09:07.690 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:09:07.690 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd4 00:09:07.690 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:07.690 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:07.690 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd4 /proc/partitions 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 
00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.691 1+0 records in 00:09:07.691 1+0 records out 00:09:07.691 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000311556 s, 13.1 MB/s 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.691 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd5 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i 
<= 20 )) 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd5 /proc/partitions 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:07.951 1+0 records in 00:09:07.951 1+0 records out 00:09:07.951 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000355302 s, 11.5 MB/s 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:07.951 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd6 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd6 /proc/partitions 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.212 1+0 records in 00:09:08.212 1+0 records out 00:09:08.212 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000353634 s, 11.6 MB/s 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.212 13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.212 
13:17:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd7 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd7 /proc/partitions 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.473 1+0 records in 00:09:08.473 1+0 records out 00:09:08.473 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000504832 s, 8.1 MB/s 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.473 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd8 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd8 /proc/partitions 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.734 1+0 records in 00:09:08.734 1+0 records out 
00:09:08.734 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000480668 s, 8.5 MB/s 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.734 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd9 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd9 /proc/partitions 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:08.995 13:17:49 
blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:08.995 1+0 records in 00:09:08.995 1+0 records out 00:09:08.995 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000600356 s, 6.8 MB/s 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:08.995 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 
00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.257 1+0 records in 00:09:09.257 1+0 records out 00:09:09.257 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000574621 s, 7.1 MB/s 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.257 13:17:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 
-- # nbd_device=/dev/nbd11 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.583 1+0 records in 00:09:09.583 1+0 records out 00:09:09.583 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000603464 s, 6.8 MB/s 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:09.583 
13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.583 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd12 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd12 /proc/partitions 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:09.844 1+0 records in 00:09:09.844 1+0 records out 00:09:09.844 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000548425 s, 7.5 MB/s 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.844 13:17:50 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@886 -- # size=4096 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:09.844 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd13 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd13 /proc/partitions 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd13 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.106 1+0 records in 00:09:10.106 1+0 records out 00:09:10.106 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00065095 s, 6.3 MB/s 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.106 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd14 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q 
-w nbd14 /proc/partitions 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.368 1+0 records in 00:09:10.368 1+0 records out 00:09:10.368 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000616556 s, 6.6 MB/s 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.368 13:17:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@868 -- # local nbd_name=nbd15 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd15 /proc/partitions 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:10.627 1+0 records in 00:09:10.627 1+0 records out 00:09:10.627 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0006152 s, 6.7 MB/s 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.627 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:10.628 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:10.628 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:10.628 13:17:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:10.628 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:09:10.628 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:09:10.628 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd0", 00:09:10.887 "bdev_name": "Malloc0" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd1", 00:09:10.887 "bdev_name": "Malloc1p0" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd2", 00:09:10.887 "bdev_name": "Malloc1p1" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd3", 00:09:10.887 "bdev_name": "Malloc2p0" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd4", 00:09:10.887 "bdev_name": "Malloc2p1" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd5", 00:09:10.887 "bdev_name": "Malloc2p2" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd6", 00:09:10.887 "bdev_name": "Malloc2p3" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd7", 00:09:10.887 "bdev_name": "Malloc2p4" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd8", 00:09:10.887 "bdev_name": "Malloc2p5" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd9", 00:09:10.887 "bdev_name": "Malloc2p6" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd10", 00:09:10.887 "bdev_name": "Malloc2p7" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd11", 00:09:10.887 "bdev_name": "TestPT" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd12", 00:09:10.887 "bdev_name": "raid0" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd13", 00:09:10.887 "bdev_name": "concat0" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd14", 00:09:10.887 "bdev_name": "raid1" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd15", 00:09:10.887 "bdev_name": "AIO0" 00:09:10.887 } 00:09:10.887 ]' 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd0", 00:09:10.887 "bdev_name": "Malloc0" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd1", 00:09:10.887 "bdev_name": "Malloc1p0" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd2", 00:09:10.887 "bdev_name": "Malloc1p1" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd3", 00:09:10.887 "bdev_name": "Malloc2p0" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd4", 00:09:10.887 "bdev_name": "Malloc2p1" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd5", 00:09:10.887 "bdev_name": "Malloc2p2" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd6", 00:09:10.887 "bdev_name": "Malloc2p3" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd7", 00:09:10.887 "bdev_name": "Malloc2p4" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd8", 00:09:10.887 "bdev_name": "Malloc2p5" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd9", 00:09:10.887 "bdev_name": "Malloc2p6" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd10", 00:09:10.887 "bdev_name": "Malloc2p7" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd11", 00:09:10.887 "bdev_name": "TestPT" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd12", 00:09:10.887 "bdev_name": "raid0" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd13", 00:09:10.887 "bdev_name": "concat0" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd14", 00:09:10.887 "bdev_name": "raid1" 00:09:10.887 }, 00:09:10.887 { 00:09:10.887 "nbd_device": "/dev/nbd15", 00:09:10.887 "bdev_name": "AIO0" 00:09:10.887 } 00:09:10.887 ]' 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:10.887 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:11.147 13:17:51 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:11.147 13:17:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:11.407 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:11.407 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:11.407 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:11.407 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:11.407 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:11.407 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:09:11.407 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:09:11.407 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:11.407 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:11.407 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:11.667 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:11.667 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:11.667 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:11.667 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:11.667 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:11.667 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:11.667 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:11.667 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:11.667 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:11.667 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:11.928 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:11.928 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:11.928 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:11.928 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:11.928 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:11.928 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w 
nbd4 /proc/partitions 00:09:11.928 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:11.928 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:11.928 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:11.928 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:12.190 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:12.190 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:12.190 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:12.190 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.190 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.190 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:12.190 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.190 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.190 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.190 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:12.451 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:12.451 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:12.451 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:12.451 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.451 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 
-- # (( i <= 20 )) 00:09:12.451 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:12.451 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.451 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.451 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.451 13:17:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:12.451 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:12.451 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:12.451 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:12.451 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.451 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.451 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:12.451 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.451 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.451 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.451 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:12.712 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:12.712 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:12.712 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:12.712 13:17:53 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.712 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.712 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:12.712 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.712 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.712 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.712 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:12.973 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:12.973 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:12.973 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:12.973 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:12.973 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:12.973 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:12.973 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:12.973 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:12.973 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:12.973 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:13.233 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:13.233 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:13.233 13:17:53 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:13.233 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.233 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.233 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:13.233 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.233 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.233 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.233 13:17:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:13.494 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:13.494 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:13.494 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:13.494 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.494 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.494 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:13.494 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.494 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.494 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.494 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 
00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:13.754 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_stop_disk /dev/nbd14 00:09:14.014 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:14.014 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:14.014 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:14.014 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.014 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.014 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:14.014 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.014 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.014 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:14.014 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:14.275 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:14.275 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:14.275 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:14.275 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:14.275 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:14.275 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:14.275 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:14.275 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:14.275 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:14.275 13:17:54 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:14.275 13:17:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # 
bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@12 -- # local i 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:14.535 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:09:14.796 /dev/nbd0 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:14.796 1+0 records in 00:09:14.796 1+0 records out 00:09:14.796 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319742 s, 12.8 MB/s 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@886 -- # size=4096 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:14.796 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:09:15.057 /dev/nbd1 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:09:15.057 1+0 records in 00:09:15.057 1+0 records out 00:09:15.057 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246372 s, 16.6 MB/s 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:15.057 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:09:15.317 /dev/nbd10 00:09:15.317 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:09:15.317 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:09:15.317 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:09:15.317 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:15.317 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:15.317 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:15.317 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:09:15.317 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:15.317 13:17:55 
blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:15.317 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:15.317 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:15.317 1+0 records in 00:09:15.317 1+0 records out 00:09:15.317 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000325051 s, 12.6 MB/s 00:09:15.318 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.318 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:15.318 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.318 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:15.318 13:17:55 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:15.318 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:15.318 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:15.318 13:17:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:09:15.578 /dev/nbd11 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:15.578 1+0 records in 00:09:15.578 1+0 records out 00:09:15.578 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000319059 s, 12.8 MB/s 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:15.578 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:09:15.839 /dev/nbd12 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
basename /dev/nbd12 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd12 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd12 /proc/partitions 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:15.839 1+0 records in 00:09:15.839 1+0 records out 00:09:15.839 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000392953 s, 10.4 MB/s 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:15.839 13:17:56 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:15.839 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:09:16.099 /dev/nbd13 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd13 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd13 /proc/partitions 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:16.099 1+0 records in 00:09:16.099 1+0 records out 00:09:16.099 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000313942 s, 13.0 MB/s 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:16.099 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:09:16.099 /dev/nbd14 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd14 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd14 /proc/partitions 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:16.360 1+0 records in 00:09:16.360 1+0 records out 00:09:16.360 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000396989 s, 
10.3 MB/s 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:16.360 13:17:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:09:16.360 /dev/nbd15 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd15 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd15 /proc/partitions 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:16.621 1+0 records in 00:09:16.621 1+0 records out 00:09:16.621 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000532497 s, 7.7 MB/s 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:16.621 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:09:16.621 /dev/nbd2 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 
00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:16.882 1+0 records in 00:09:16.882 1+0 records out 00:09:16.882 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00188836 s, 2.2 MB/s 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:16.882 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:09:16.882 /dev/nbd3 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:09:17.143 13:17:57 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.143 1+0 records in 00:09:17.143 1+0 records out 00:09:17.143 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000622129 s, 6.6 MB/s 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.143 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 /dev/nbd4 00:09:17.143 /dev/nbd4 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd4 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd4 /proc/partitions 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.403 1+0 records in 00:09:17.403 1+0 records out 00:09:17.403 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000485212 s, 8.4 MB/s 00:09:17.403 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.404 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:17.404 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.404 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 
00:09:17.404 13:17:57 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:17.404 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.404 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.404 13:17:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:09:17.404 /dev/nbd5 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd5 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd5 /proc/partitions 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.664 1+0 records in 00:09:17.664 1+0 records out 00:09:17.664 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000529348 s, 7.7 MB/s 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.664 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:09:17.664 /dev/nbd6 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd6 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd6 /proc/partitions 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd6 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:17.925 1+0 records in 00:09:17.925 1+0 records out 00:09:17.925 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000660796 s, 6.2 MB/s 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:17.925 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:09:17.925 /dev/nbd7 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd7 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd7 /proc/partitions 00:09:18.185 13:17:58 
blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.185 1+0 records in 00:09:18.185 1+0 records out 00:09:18.185 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00067087 s, 6.1 MB/s 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.185 13:17:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:09:18.446 /dev/nbd8 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd8 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # 
local i 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd8 /proc/partitions 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.446 1+0 records in 00:09:18.446 1+0 records out 00:09:18.446 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000549936 s, 7.4 MB/s 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.446 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:09:18.707 /dev/nbd9 00:09:18.707 13:17:59 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd9 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd9 /proc/partitions 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:18.707 1+0 records in 00:09:18.707 1+0 records out 00:09:18.707 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000648754 s, 6.3 MB/s 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 
-- # (( i++ )) 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:18.707 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:09:18.968 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd0", 00:09:18.968 "bdev_name": "Malloc0" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd1", 00:09:18.968 "bdev_name": "Malloc1p0" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd10", 00:09:18.968 "bdev_name": "Malloc1p1" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd11", 00:09:18.968 "bdev_name": "Malloc2p0" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd12", 00:09:18.968 "bdev_name": "Malloc2p1" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd13", 00:09:18.968 "bdev_name": "Malloc2p2" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd14", 00:09:18.968 "bdev_name": "Malloc2p3" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd15", 00:09:18.968 "bdev_name": "Malloc2p4" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd2", 00:09:18.968 "bdev_name": "Malloc2p5" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd3", 00:09:18.968 "bdev_name": "Malloc2p6" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd4", 00:09:18.968 "bdev_name": "Malloc2p7" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd5", 00:09:18.968 "bdev_name": "TestPT" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd6", 00:09:18.968 
"bdev_name": "raid0" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd7", 00:09:18.968 "bdev_name": "concat0" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd8", 00:09:18.968 "bdev_name": "raid1" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd9", 00:09:18.968 "bdev_name": "AIO0" 00:09:18.968 } 00:09:18.968 ]' 00:09:18.968 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd0", 00:09:18.968 "bdev_name": "Malloc0" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd1", 00:09:18.968 "bdev_name": "Malloc1p0" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd10", 00:09:18.968 "bdev_name": "Malloc1p1" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd11", 00:09:18.968 "bdev_name": "Malloc2p0" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd12", 00:09:18.968 "bdev_name": "Malloc2p1" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd13", 00:09:18.968 "bdev_name": "Malloc2p2" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd14", 00:09:18.968 "bdev_name": "Malloc2p3" 00:09:18.968 }, 00:09:18.968 { 00:09:18.968 "nbd_device": "/dev/nbd15", 00:09:18.968 "bdev_name": "Malloc2p4" 00:09:18.968 }, 00:09:18.968 { 00:09:18.969 "nbd_device": "/dev/nbd2", 00:09:18.969 "bdev_name": "Malloc2p5" 00:09:18.969 }, 00:09:18.969 { 00:09:18.969 "nbd_device": "/dev/nbd3", 00:09:18.969 "bdev_name": "Malloc2p6" 00:09:18.969 }, 00:09:18.969 { 00:09:18.969 "nbd_device": "/dev/nbd4", 00:09:18.969 "bdev_name": "Malloc2p7" 00:09:18.969 }, 00:09:18.969 { 00:09:18.969 "nbd_device": "/dev/nbd5", 00:09:18.969 "bdev_name": "TestPT" 00:09:18.969 }, 00:09:18.969 { 00:09:18.969 "nbd_device": "/dev/nbd6", 00:09:18.969 "bdev_name": "raid0" 00:09:18.969 }, 00:09:18.969 { 00:09:18.969 "nbd_device": "/dev/nbd7", 00:09:18.969 "bdev_name": "concat0" 00:09:18.969 }, 00:09:18.969 { 
00:09:18.969 "nbd_device": "/dev/nbd8", 00:09:18.969 "bdev_name": "raid1" 00:09:18.969 }, 00:09:18.969 { 00:09:18.969 "nbd_device": "/dev/nbd9", 00:09:18.969 "bdev_name": "AIO0" 00:09:18.969 } 00:09:18.969 ]' 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:09:18.969 /dev/nbd1 00:09:18.969 /dev/nbd10 00:09:18.969 /dev/nbd11 00:09:18.969 /dev/nbd12 00:09:18.969 /dev/nbd13 00:09:18.969 /dev/nbd14 00:09:18.969 /dev/nbd15 00:09:18.969 /dev/nbd2 00:09:18.969 /dev/nbd3 00:09:18.969 /dev/nbd4 00:09:18.969 /dev/nbd5 00:09:18.969 /dev/nbd6 00:09:18.969 /dev/nbd7 00:09:18.969 /dev/nbd8 00:09:18.969 /dev/nbd9' 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:09:18.969 /dev/nbd1 00:09:18.969 /dev/nbd10 00:09:18.969 /dev/nbd11 00:09:18.969 /dev/nbd12 00:09:18.969 /dev/nbd13 00:09:18.969 /dev/nbd14 00:09:18.969 /dev/nbd15 00:09:18.969 /dev/nbd2 00:09:18.969 /dev/nbd3 00:09:18.969 /dev/nbd4 00:09:18.969 /dev/nbd5 00:09:18.969 /dev/nbd6 00:09:18.969 /dev/nbd7 00:09:18.969 /dev/nbd8 00:09:18.969 /dev/nbd9' 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:09:18.969 256+0 records in 00:09:18.969 256+0 records out 00:09:18.969 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0127365 s, 82.3 MB/s 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:09:18.969 256+0 records in 00:09:18.969 256+0 records out 00:09:18.969 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.13322 s, 7.9 MB/s 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:18.969 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:09:19.231 256+0 records in 00:09:19.231 256+0 records out 00:09:19.231 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.125266 s, 8.4 MB/s 00:09:19.231 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 
00:09:19.231 13:17:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:09:19.491 256+0 records in 00:09:19.491 256+0 records out 00:09:19.491 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.138166 s, 7.6 MB/s 00:09:19.491 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:19.491 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:09:19.491 256+0 records in 00:09:19.491 256+0 records out 00:09:19.491 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.134477 s, 7.8 MB/s 00:09:19.491 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:19.491 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:09:19.752 256+0 records in 00:09:19.752 256+0 records out 00:09:19.752 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.131253 s, 8.0 MB/s 00:09:19.752 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:19.752 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:09:19.752 256+0 records in 00:09:19.752 256+0 records out 00:09:19.752 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.12481 s, 8.4 MB/s 00:09:19.752 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:19.752 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:09:20.011 256+0 records in 00:09:20.012 256+0 
records out 00:09:20.012 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.132407 s, 7.9 MB/s 00:09:20.012 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.012 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:09:20.012 256+0 records in 00:09:20.012 256+0 records out 00:09:20.012 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.133527 s, 7.9 MB/s 00:09:20.012 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.012 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:09:20.272 256+0 records in 00:09:20.272 256+0 records out 00:09:20.272 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.12593 s, 8.3 MB/s 00:09:20.272 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.272 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:09:20.272 256+0 records in 00:09:20.272 256+0 records out 00:09:20.272 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.132307 s, 7.9 MB/s 00:09:20.272 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.272 13:18:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:09:20.533 256+0 records in 00:09:20.533 256+0 records out 00:09:20.533 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.131467 s, 8.0 MB/s 00:09:20.533 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.533 13:18:01 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 oflag=direct 00:09:20.533 256+0 records in 00:09:20.533 256+0 records out 00:09:20.533 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.129475 s, 8.1 MB/s 00:09:20.533 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.533 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:09:20.793 256+0 records in 00:09:20.793 256+0 records out 00:09:20.793 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.129847 s, 8.1 MB/s 00:09:20.793 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.793 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:09:20.793 256+0 records in 00:09:20.794 256+0 records out 00:09:20.794 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.133092 s, 7.9 MB/s 00:09:20.794 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:20.794 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:09:21.054 256+0 records in 00:09:21.054 256+0 records out 00:09:21.054 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.134414 s, 7.8 MB/s 00:09:21.054 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:09:21.054 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:09:21.054 256+0 records in 00:09:21.054 256+0 records out 00:09:21.054 1048576 bytes (1.0 MB, 1.0 
MiB) copied, 0.129732 s, 8.1 MB/s 00:09:21.054 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:09:21.054 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:21.054 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:09:21.054 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:09:21.054 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:21.054 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:09:21.054 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:09:21.055 
13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.055 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for 
i in "${nbd_list[@]}" 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 
/dev/nbd9' 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:21.315 13:18:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:21.576 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:21.836 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:09:22.096 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:09:22.096 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:09:22.096 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:09:22.096 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:22.096 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:22.096 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:09:22.096 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:22.096 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:22.096 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:22.096 13:18:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:09:22.357 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:09:22.357 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:09:22.357 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:09:22.357 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:22.357 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:22.357 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:09:22.357 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:22.357 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:22.357 13:18:03 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:22.357 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:09:22.617 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:09:22.617 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:09:22.617 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:09:22.617 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:22.617 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:22.617 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:09:22.617 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:22.617 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:22.617 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:22.617 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:09:22.878 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:09:22.878 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:09:22.878 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:09:22.878 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:22.878 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:22.878 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:09:22.878 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:22.878 13:18:03 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:22.878 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:22.878 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 
/proc/partitions 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.174 13:18:03 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:09:23.434 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:09:23.434 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:09:23.434 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:09:23.435 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.435 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.435 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:09:23.435 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.435 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.435 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.435 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:09:23.694 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:09:23.695 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:09:23.695 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:09:23.695 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.695 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # 
(( i <= 20 )) 00:09:23.695 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:09:23.695 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.695 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.695 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.695 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:09:23.955 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:09:23.955 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:09:23.955 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:09:23.955 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:23.955 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:23.955 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:09:23.955 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:23.955 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:23.955 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:23.955 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:09:24.215 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:09:24.215 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:09:24.215 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:09:24.215 13:18:04 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.215 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.215 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:09:24.215 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.215 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.215 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.215 13:18:04 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:09:24.475 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:09:24.475 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:09:24.475 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:09:24.475 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.475 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.475 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:09:24.475 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.475 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.475 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.475 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:09:24.736 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:09:24.736 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:09:24.736 13:18:05 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:09:24.736 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.736 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.736 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:09:24.736 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.736 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.736 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:24.737 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock 
nbd_get_disks 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:09:24.998 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:09:24.998 
13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:09:25.259 malloc_lvol_verify 00:09:25.259 13:18:05 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:09:25.520 f41700aa-1522-4a20-980d-86283fd6fc67 00:09:25.520 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:09:25.781 32741a26-6a1a-48b7-b371-ada2301f17ca 00:09:25.781 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:09:25.781 /dev/nbd0 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:09:26.043 mke2fs 1.46.5 (30-Dec-2021) 00:09:26.043 Discarding device blocks: 0/4096 done 00:09:26.043 Creating filesystem with 4096 1k blocks and 1024 inodes 00:09:26.043 00:09:26.043 Allocating group tables: 0/1 done 00:09:26.043 Writing inode tables: 0/1 done 00:09:26.043 Creating journal (1024 blocks): done 00:09:26.043 Writing superblocks and filesystem accounting information: 0/1 done 00:09:26.043 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # 
local nbd_list 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 847280 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 847280 ']' 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 847280 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:09:26.043 13:18:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 847280 00:09:26.304 13:18:06 
blockdev_general.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:09:26.304 13:18:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:09:26.304 13:18:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 847280' 00:09:26.304 killing process with pid 847280 00:09:26.304 13:18:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@969 -- # kill 847280 00:09:26.304 13:18:06 blockdev_general.bdev_nbd -- common/autotest_common.sh@974 -- # wait 847280 00:09:26.565 13:18:07 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:09:26.565 00:09:26.565 real 0m20.842s 00:09:26.565 user 0m28.466s 00:09:26.565 sys 0m8.769s 00:09:26.565 13:18:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:26.565 13:18:07 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:09:26.565 ************************************ 00:09:26.565 END TEST bdev_nbd 00:09:26.565 ************************************ 00:09:26.565 13:18:07 blockdev_general -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:09:26.565 13:18:07 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = nvme ']' 00:09:26.565 13:18:07 blockdev_general -- bdev/blockdev.sh@763 -- # '[' bdev = gpt ']' 00:09:26.565 13:18:07 blockdev_general -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:09:26.565 13:18:07 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:09:26.565 13:18:07 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:26.565 13:18:07 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:26.565 ************************************ 00:09:26.565 START TEST bdev_fio 00:09:26.565 ************************************ 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:09:26.565 13:18:07 
blockdev_general.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:26.565 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc0]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc0 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p0]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p0 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc1p1]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc1p1 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p0]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@342 -- # echo filename=Malloc2p0 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p1]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p1 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p2]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p2 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p3]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p3 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p4]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p4 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p5]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p5 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p6]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p6 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 
00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_Malloc2p7]' 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=Malloc2p7 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.565 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_TestPT]' 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=TestPT 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid0]' 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid0 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_concat0]' 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=concat0 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_raid1]' 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=raid1 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_AIO0]' 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=AIO0 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:26.566 13:18:07 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:26.566 ************************************ 00:09:26.566 START TEST bdev_fio_rw_verify 00:09:26.566 ************************************ 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local 
fio_dir=/usr/src/fio 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:09:26.566 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:26.846 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:26.846 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:26.846 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:26.846 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:26.846 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:26.846 13:18:07 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:26.846 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:26.846 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:26.846 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:26.846 13:18:07 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:27.106 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, 
ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:27.106 fio-3.35 00:09:27.106 Starting 16 threads 00:09:39.405 00:09:39.405 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=851691: Thu Jul 25 13:18:18 2024 00:09:39.405 read: IOPS=106k, BW=414MiB/s (434MB/s)(4138MiB/10001msec) 00:09:39.405 slat (usec): min=2, max=3361, avg=28.12, stdev=18.39 00:09:39.405 clat (usec): min=7, max=3771, avg=243.46, stdev=135.70 00:09:39.405 lat (usec): min=11, max=3789, avg=271.58, stdev=144.11 00:09:39.405 clat percentiles (usec): 00:09:39.405 | 50.000th=[ 231], 99.000th=[ 627], 99.900th=[ 816], 99.990th=[ 1020], 00:09:39.405 | 99.999th=[ 1237] 00:09:39.405 write: IOPS=165k, BW=643MiB/s (674MB/s)(6356MiB/9883msec); 0 zone resets 00:09:39.405 slat (usec): min=3, max=547, avg=42.42, stdev=20.89 00:09:39.405 clat (usec): min=7, max=1731, avg=303.46, stdev=161.56 00:09:39.405 lat (usec): min=26, max=1826, avg=345.88, stdev=171.38 00:09:39.405 clat percentiles (usec): 
00:09:39.405 | 50.000th=[ 285], 99.000th=[ 775], 99.900th=[ 1037], 99.990th=[ 1205], 00:09:39.405 | 99.999th=[ 1352] 00:09:39.405 bw ( KiB/s): min=466168, max=790616, per=98.40%, avg=648011.79, stdev=6399.41, samples=304 00:09:39.405 iops : min=116542, max=197654, avg=162002.84, stdev=1599.85, samples=304 00:09:39.405 lat (usec) : 10=0.01%, 20=0.11%, 50=2.10%, 100=8.82%, 250=36.59% 00:09:39.405 lat (usec) : 500=43.57%, 750=7.99%, 1000=0.73% 00:09:39.405 lat (msec) : 2=0.09%, 4=0.01% 00:09:39.405 cpu : usr=99.33%, sys=0.29%, ctx=586, majf=0, minf=2546 00:09:39.405 IO depths : 1=12.4%, 2=24.8%, 4=50.2%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:39.405 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:39.405 complete : 0=0.0%, 4=89.0%, 8=11.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:39.405 issued rwts: total=1059436,1627051,0,0 short=0,0,0,0 dropped=0,0,0,0 00:09:39.405 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:39.405 00:09:39.405 Run status group 0 (all jobs): 00:09:39.405 READ: bw=414MiB/s (434MB/s), 414MiB/s-414MiB/s (434MB/s-434MB/s), io=4138MiB (4339MB), run=10001-10001msec 00:09:39.405 WRITE: bw=643MiB/s (674MB/s), 643MiB/s-643MiB/s (674MB/s-674MB/s), io=6356MiB (6664MB), run=9883-9883msec 00:09:39.405 00:09:39.405 real 0m11.370s 00:09:39.405 user 2m46.875s 00:09:39.405 sys 0m2.041s 00:09:39.405 13:18:18 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:39.405 13:18:18 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:09:39.405 ************************************ 00:09:39.405 END TEST bdev_fio_rw_verify 00:09:39.405 ************************************ 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:39.405 13:18:18 
blockdev_general.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:09:39.405 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:39.407 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' 
"662766cf-a329-4b13-8bbf-beb7530fad19"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "662766cf-a329-4b13-8bbf-beb7530fad19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "8499c77c-0f29-5e8a-8896-2004e873e147"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "8499c77c-0f29-5e8a-8896-2004e873e147",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' 
'}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "7a3a2d7f-485d-5e9f-893e-9b0f4d59dc62"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7a3a2d7f-485d-5e9f-893e-9b0f4d59dc62",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "f37f913b-e042-5a4b-8baf-39ef74035b81"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f37f913b-e042-5a4b-8baf-39ef74035b81",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' 
"e8b18e7a-7da5-5fa0-8b5f-ab69b95611fd"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e8b18e7a-7da5-5fa0-8b5f-ab69b95611fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "d7540043-92ff-5f7a-998d-91853a2b9c68"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d7540043-92ff-5f7a-998d-91853a2b9c68",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "11162537-5c21-5fc0-95a0-e3b5fc1714c9"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "11162537-5c21-5fc0-95a0-e3b5fc1714c9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "b7cfacdd-932e-5978-8bfd-0b0b57b936b1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b7cfacdd-932e-5978-8bfd-0b0b57b936b1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "7cfbd5e3-03d8-55ab-a794-7e265fe1ec0e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 
512,' ' "num_blocks": 8192,' ' "uuid": "7cfbd5e3-03d8-55ab-a794-7e265fe1ec0e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "41c854c2-376d-5540-938f-6c103f9acba1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "41c854c2-376d-5540-938f-6c103f9acba1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "b70a9f28-ebd6-565c-a639-8069043f8871"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": 
"b70a9f28-ebd6-565c-a639-8069043f8871",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "f65e4131-bbc3-5b92-92e2-60bbb5306484"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f65e4131-bbc3-5b92-92e2-60bbb5306484",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' 
"9ff378bc-36fb-48cf-83cf-be3bb4f804cb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "9ff378bc-36fb-48cf-83cf-be3bb4f804cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "9ff378bc-36fb-48cf-83cf-be3bb4f804cb",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "5c832d3d-b2d2-4dd8-b3d8-3fadcabe7ea4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "5e0865da-3075-428a-9538-df4fb2ece21d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "d2995536-33da-47ba-ae0d-bbe563b5ee92"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "d2995536-33da-47ba-ae0d-bbe563b5ee92",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d2995536-33da-47ba-ae0d-bbe563b5ee92",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "7e5009b0-a0a0-4e3c-b6e7-fdf6341a30c4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "06d99b35-06ec-4c4c-acc5-2eb5d368ef4a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "211d6cb5-8420-46e9-8e58-7cbd2cb6ceea"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "211d6cb5-8420-46e9-8e58-7cbd2cb6ceea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "211d6cb5-8420-46e9-8e58-7cbd2cb6ceea",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "893433b6-0433-4c8b-aa5e-c9ef348563b6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "9f1270e8-d25a-42d6-8b71-baa792c5685b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "b6ec2202-c66a-4eb8-92c9-20ef283bbe95"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "b6ec2202-c66a-4eb8-92c9-20ef283bbe95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' 
"write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:39.407 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n Malloc0 00:09:39.407 Malloc1p0 00:09:39.407 Malloc1p1 00:09:39.407 Malloc2p0 00:09:39.407 Malloc2p1 00:09:39.407 Malloc2p2 00:09:39.407 Malloc2p3 00:09:39.407 Malloc2p4 00:09:39.407 Malloc2p5 00:09:39.407 Malloc2p6 00:09:39.407 Malloc2p7 00:09:39.407 TestPT 00:09:39.407 raid0 00:09:39.407 concat0 ]] 00:09:39.407 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "662766cf-a329-4b13-8bbf-beb7530fad19"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "662766cf-a329-4b13-8bbf-beb7530fad19",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "8499c77c-0f29-5e8a-8896-2004e873e147"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "8499c77c-0f29-5e8a-8896-2004e873e147",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "7a3a2d7f-485d-5e9f-893e-9b0f4d59dc62"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "7a3a2d7f-485d-5e9f-893e-9b0f4d59dc62",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' 
"copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "f37f913b-e042-5a4b-8baf-39ef74035b81"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "f37f913b-e042-5a4b-8baf-39ef74035b81",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "e8b18e7a-7da5-5fa0-8b5f-ab69b95611fd"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e8b18e7a-7da5-5fa0-8b5f-ab69b95611fd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' 
"driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "d7540043-92ff-5f7a-998d-91853a2b9c68"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "d7540043-92ff-5f7a-998d-91853a2b9c68",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "11162537-5c21-5fc0-95a0-e3b5fc1714c9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "11162537-5c21-5fc0-95a0-e3b5fc1714c9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": 
"Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "b7cfacdd-932e-5978-8bfd-0b0b57b936b1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b7cfacdd-932e-5978-8bfd-0b0b57b936b1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "7cfbd5e3-03d8-55ab-a794-7e265fe1ec0e"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7cfbd5e3-03d8-55ab-a794-7e265fe1ec0e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' 
'}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "41c854c2-376d-5540-938f-6c103f9acba1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "41c854c2-376d-5540-938f-6c103f9acba1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "b70a9f28-ebd6-565c-a639-8069043f8871"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "b70a9f28-ebd6-565c-a639-8069043f8871",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' 
"f65e4131-bbc3-5b92-92e2-60bbb5306484"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f65e4131-bbc3-5b92-92e2-60bbb5306484",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": true,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": true,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "9ff378bc-36fb-48cf-83cf-be3bb4f804cb"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "9ff378bc-36fb-48cf-83cf-be3bb4f804cb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' 
{' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "9ff378bc-36fb-48cf-83cf-be3bb4f804cb",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "5c832d3d-b2d2-4dd8-b3d8-3fadcabe7ea4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "5e0865da-3075-428a-9538-df4fb2ece21d",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "d2995536-33da-47ba-ae0d-bbe563b5ee92"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "d2995536-33da-47ba-ae0d-bbe563b5ee92",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' 
"dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d2995536-33da-47ba-ae0d-bbe563b5ee92",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "7e5009b0-a0a0-4e3c-b6e7-fdf6341a30c4",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "06d99b35-06ec-4c4c-acc5-2eb5d368ef4a",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "211d6cb5-8420-46e9-8e58-7cbd2cb6ceea"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "211d6cb5-8420-46e9-8e58-7cbd2cb6ceea",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": false,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": 
"211d6cb5-8420-46e9-8e58-7cbd2cb6ceea",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "893433b6-0433-4c8b-aa5e-c9ef348563b6",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "9f1270e8-d25a-42d6-8b71-baa792c5685b",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "b6ec2202-c66a-4eb8-92c9-20ef283bbe95"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "b6ec2202-c66a-4eb8-92c9-20ef283bbe95",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc0]' 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@357 -- # echo filename=Malloc0 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p0]' 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p0 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc1p1]' 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc1p1 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p0]' 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p0 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p1]' 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p1 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p2]' 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p2 00:09:39.408 13:18:18 blockdev_general.bdev_fio -- 
bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p3]' 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p3 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p4]' 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p4 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p5]' 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p5 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p6]' 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p6 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_Malloc2p7]' 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=Malloc2p7 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == 
true) | .name') 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_TestPT]' 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=TestPT 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_raid0]' 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=raid0 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_concat0]' 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=concat0 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:39.409 13:18:18 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:39.409 ************************************ 00:09:39.409 START TEST bdev_fio_trim 00:09:39.409 ************************************ 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:09:39.409 
13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:09:39.409 13:18:18 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:09:39.409 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:09:39.409 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:09:39.409 fio-3.35 00:09:39.409 Starting 14 threads 00:09:49.414 00:09:49.414 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=854356: Thu Jul 25 13:18:30 2024 00:09:49.414 write: IOPS=183k, BW=716MiB/s (751MB/s)(7164MiB/10001msec); 0 zone resets 00:09:49.414 slat (usec): min=2, max=415, avg=27.03, stdev=12.50 00:09:49.414 clat (usec): min=9, max=2162, avg=199.25, stdev=78.01 00:09:49.414 lat (usec): min=18, 
max=2166, avg=226.28, stdev=79.72 00:09:49.414 clat percentiles (usec): 00:09:49.414 | 50.000th=[ 190], 99.000th=[ 412], 99.900th=[ 502], 99.990th=[ 570], 00:09:49.414 | 99.999th=[ 783] 00:09:49.414 bw ( KiB/s): min=701888, max=829440, per=100.00%, avg=735177.21, stdev=3468.24, samples=266 00:09:49.414 iops : min=175472, max=207360, avg=183794.32, stdev=867.04, samples=266 00:09:49.414 trim: IOPS=183k, BW=716MiB/s (751MB/s)(7164MiB/10001msec); 0 zone resets 00:09:49.414 slat (usec): min=3, max=147, avg=17.51, stdev= 7.68 00:09:49.414 clat (usec): min=3, max=2034, avg=214.16, stdev=82.23 00:09:49.414 lat (usec): min=9, max=2042, avg=231.67, stdev=84.60 00:09:49.414 clat percentiles (usec): 00:09:49.414 | 50.000th=[ 210], 99.000th=[ 408], 99.900th=[ 461], 99.990th=[ 545], 00:09:49.414 | 99.999th=[ 725] 00:09:49.414 bw ( KiB/s): min=701888, max=829440, per=100.00%, avg=735177.63, stdev=3468.22, samples=266 00:09:49.414 iops : min=175472, max=207360, avg=183794.21, stdev=867.04, samples=266 00:09:49.414 lat (usec) : 4=0.01%, 10=0.12%, 20=0.30%, 50=1.23%, 100=5.95% 00:09:49.414 lat (usec) : 250=64.22%, 500=28.11%, 750=0.06%, 1000=0.01% 00:09:49.414 lat (msec) : 2=0.01%, 4=0.01% 00:09:49.414 cpu : usr=99.70%, sys=0.00%, ctx=476, majf=0, minf=872 00:09:49.414 IO depths : 1=12.2%, 2=24.5%, 4=50.0%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:09:49.414 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:49.414 complete : 0=0.0%, 4=89.2%, 8=10.8%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:09:49.414 issued rwts: total=0,1833871,1833876,0 short=0,0,0,0 dropped=0,0,0,0 00:09:49.414 latency : target=0, window=0, percentile=100.00%, depth=8 00:09:49.414 00:09:49.414 Run status group 0 (all jobs): 00:09:49.414 WRITE: bw=716MiB/s (751MB/s), 716MiB/s-716MiB/s (751MB/s-751MB/s), io=7164MiB (7512MB), run=10001-10001msec 00:09:49.414 TRIM: bw=716MiB/s (751MB/s), 716MiB/s-716MiB/s (751MB/s-751MB/s), io=7164MiB (7512MB), run=10001-10001msec 00:09:49.674 00:09:49.674 
real 0m11.431s 00:09:49.674 user 2m28.470s 00:09:49.674 sys 0m1.358s 00:09:49.674 13:18:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:49.674 13:18:30 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:09:49.674 ************************************ 00:09:49.674 END TEST bdev_fio_trim 00:09:49.674 ************************************ 00:09:49.674 13:18:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:09:49.674 13:18:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:09:49.674 13:18:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:09:49.674 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:09:49.674 13:18:30 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:09:49.674 00:09:49.674 real 0m23.167s 00:09:49.674 user 5m15.545s 00:09:49.674 sys 0m3.592s 00:09:49.674 13:18:30 blockdev_general.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:49.674 13:18:30 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:09:49.674 ************************************ 00:09:49.674 END TEST bdev_fio 00:09:49.674 ************************************ 00:09:49.674 13:18:30 blockdev_general -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:09:49.674 13:18:30 blockdev_general -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:49.674 13:18:30 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:09:49.674 13:18:30 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:49.674 13:18:30 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:49.674 
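The fio_plugin trace above (autotest_common.sh@1337-1352) probes the spdk_bdev plugin with ldd for each sanitizer runtime and builds an LD_PRELOAD so an ASan-instrumented plugin loads before fio. A minimal sketch of that probe, with a hypothetical plugin path standing in for the workspace path in the log:

```shell
#!/bin/sh
# Sketch of the sanitizer probe seen in the xtrace above. The plugin path
# below is illustrative, not the real workspace path from this run.
plugin=/path/to/spdk_bdev

asan_lib=
for sanitizer in libasan libclang_rt.asan; do
  # ldd lists the plugin's shared-library dependencies; awk pulls the
  # resolved path column. Pipeline exits 0 even when nothing matches.
  asan_lib=$(ldd "$plugin" 2>/dev/null | grep "$sanitizer" | awk '{print $3}')
  if [ -n "$asan_lib" ]; then
    break
  fi
done

# Preload the sanitizer runtime (if any) ahead of the plugin itself,
# mirroring the LD_PRELOAD=' .../spdk_bdev' line in the log.
LD_PRELOAD="$asan_lib $plugin"
```

In this run neither sanitizer library matched, which is why the log shows `asan_lib=` empty and an LD_PRELOAD containing only the plugin.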
************************************ 00:09:49.674 START TEST bdev_verify 00:09:49.674 ************************************ 00:09:49.674 13:18:30 blockdev_general.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:09:49.935 [2024-07-25 13:18:30.539989] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:09:49.935 [2024-07-25 13:18:30.540113] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid856057 ] 00:09:49.935 [2024-07-25 13:18:30.683472] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:50.196 [2024-07-25 13:18:30.756569] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:50.197 [2024-07-25 13:18:30.756579] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:50.197 [2024-07-25 13:18:30.894874] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:50.197 [2024-07-25 13:18:30.894934] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:50.197 [2024-07-25 13:18:30.894943] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:50.197 [2024-07-25 13:18:30.902880] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:50.197 [2024-07-25 13:18:30.902902] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:50.197 [2024-07-25 13:18:30.910894] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:50.197 [2024-07-25 13:18:30.910914] bdev.c:8190:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc2 00:09:50.197 [2024-07-25 13:18:30.985229] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:50.197 [2024-07-25 13:18:30.985289] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:50.197 [2024-07-25 13:18:30.985299] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2470a40 00:09:50.197 [2024-07-25 13:18:30.985306] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:50.197 [2024-07-25 13:18:30.986863] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:50.197 [2024-07-25 13:18:30.986900] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:50.456 Running I/O for 5 seconds... 00:09:57.038 00:09:57.038 Latency(us) 00:09:57.038 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:57.038 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.038 Verification LBA range: start 0x0 length 0x1000 00:09:57.038 Malloc0 : 5.19 1356.88 5.30 0.00 0.00 94148.37 412.75 301667.25 00:09:57.038 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.038 Verification LBA range: start 0x1000 length 0x1000 00:09:57.038 Malloc0 : 5.22 932.13 3.64 0.00 0.00 136974.66 630.15 419430.40 00:09:57.038 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.038 Verification LBA range: start 0x0 length 0x800 00:09:57.038 Malloc1p0 : 5.22 711.15 2.78 0.00 0.00 179154.73 2029.10 175031.53 00:09:57.038 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x800 length 0x800 00:09:57.045 Malloc1p0 : 5.22 490.35 1.92 0.00 0.00 259556.52 2986.93 229073.53 00:09:57.045 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x800 
00:09:57.045 Malloc1p1 : 5.22 710.92 2.78 0.00 0.00 178848.23 2016.49 167772.16 00:09:57.045 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x800 length 0x800 00:09:57.045 Malloc1p1 : 5.23 489.89 1.91 0.00 0.00 259066.01 3049.94 227460.33 00:09:57.045 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x200 00:09:57.045 Malloc2p0 : 5.23 710.36 2.77 0.00 0.00 178642.83 2079.51 158899.59 00:09:57.045 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x200 length 0x200 00:09:57.045 Malloc2p0 : 5.23 489.41 1.91 0.00 0.00 258517.53 2974.33 225847.14 00:09:57.045 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x200 00:09:57.045 Malloc2p1 : 5.23 709.79 2.77 0.00 0.00 178426.72 1890.46 148413.83 00:09:57.045 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x200 length 0x200 00:09:57.045 Malloc2p1 : 5.24 488.98 1.91 0.00 0.00 258015.49 3049.94 222620.75 00:09:57.045 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x200 00:09:57.045 Malloc2p2 : 5.23 709.56 2.77 0.00 0.00 178115.50 2003.89 148413.83 00:09:57.045 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x200 length 0x200 00:09:57.045 Malloc2p2 : 5.24 488.48 1.91 0.00 0.00 257461.99 2911.31 217781.17 00:09:57.045 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x200 00:09:57.045 Malloc2p3 : 5.24 709.03 2.77 0.00 0.00 177913.01 2104.71 141961.06 00:09:57.045 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO 
size: 4096) 00:09:57.045 Verification LBA range: start 0x200 length 0x200 00:09:57.045 Malloc2p3 : 5.25 488.08 1.91 0.00 0.00 256960.19 3125.56 209715.20 00:09:57.045 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x200 00:09:57.045 Malloc2p4 : 5.24 708.40 2.77 0.00 0.00 177694.74 1915.67 141961.06 00:09:57.045 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x200 length 0x200 00:09:57.045 Malloc2p4 : 5.25 487.55 1.90 0.00 0.00 256399.53 2785.28 200842.63 00:09:57.045 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x200 00:09:57.045 Malloc2p5 : 5.24 708.16 2.77 0.00 0.00 177395.58 1978.68 140347.86 00:09:57.045 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x200 length 0x200 00:09:57.045 Malloc2p5 : 5.26 486.98 1.90 0.00 0.00 256042.94 3163.37 196003.05 00:09:57.045 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x200 00:09:57.045 Malloc2p6 : 5.25 707.71 2.76 0.00 0.00 177175.06 2180.33 137121.48 00:09:57.045 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x200 length 0x200 00:09:57.045 Malloc2p6 : 5.26 486.59 1.90 0.00 0.00 255410.77 3049.94 188743.68 00:09:57.045 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x200 00:09:57.045 Malloc2p7 : 5.25 707.03 2.76 0.00 0.00 176960.82 1903.06 140347.86 00:09:57.045 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x200 length 0x200 00:09:57.045 Malloc2p7 : 5.27 486.09 1.90 0.00 0.00 254983.84 2860.90 199229.44 
00:09:57.045 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x1000 00:09:57.045 TestPT : 5.25 687.39 2.69 0.00 0.00 180475.22 7662.67 140347.86 00:09:57.045 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x1000 length 0x1000 00:09:57.045 TestPT : 5.29 483.99 1.89 0.00 0.00 255402.14 18652.55 199229.44 00:09:57.045 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x2000 00:09:57.045 raid0 : 5.26 706.21 2.76 0.00 0.00 176414.85 2394.58 149220.43 00:09:57.045 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x2000 length 0x2000 00:09:57.045 raid0 : 5.27 485.59 1.90 0.00 0.00 253684.02 3251.59 200036.04 00:09:57.045 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x2000 00:09:57.045 concat0 : 5.26 705.88 2.76 0.00 0.00 176120.16 2167.73 158899.59 00:09:57.045 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x2000 length 0x2000 00:09:57.045 concat0 : 5.27 485.36 1.90 0.00 0.00 252975.08 3327.21 203262.42 00:09:57.045 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x1000 00:09:57.045 raid1 : 5.26 705.53 2.76 0.00 0.00 175836.23 2470.20 169385.35 00:09:57.045 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x1000 length 0x1000 00:09:57.045 raid1 : 5.30 507.47 1.98 0.00 0.00 241175.49 2709.66 208102.01 00:09:57.045 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x0 length 0x4e2 00:09:57.045 AIO0 : 5.27 704.96 2.75 0.00 0.00 
175646.25 995.64 176644.73 00:09:57.045 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:09:57.045 Verification LBA range: start 0x4e2 length 0x4e2 00:09:57.045 AIO0 : 5.30 507.28 1.98 0.00 0.00 240482.90 1575.38 217781.17 00:09:57.045 =================================================================================================================== 00:09:57.045 Total : 20243.16 79.07 0.00 0.00 198170.39 412.75 419430.40 00:09:57.045 00:09:57.045 real 0m6.327s 00:09:57.045 user 0m11.741s 00:09:57.045 sys 0m0.373s 00:09:57.045 13:18:36 blockdev_general.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:09:57.045 13:18:36 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:09:57.045 ************************************ 00:09:57.045 END TEST bdev_verify 00:09:57.045 ************************************ 00:09:57.045 13:18:36 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:57.045 13:18:36 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:09:57.045 13:18:36 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:09:57.045 13:18:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:57.045 ************************************ 00:09:57.045 START TEST bdev_verify_big_io 00:09:57.045 ************************************ 00:09:57.045 13:18:36 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:09:57.045 [2024-07-25 13:18:36.897122] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:09:57.045 [2024-07-25 13:18:36.897167] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid857270 ] 00:09:57.045 [2024-07-25 13:18:36.985669] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:57.045 [2024-07-25 13:18:37.055211] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:09:57.046 [2024-07-25 13:18:37.055214] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:09:57.046 [2024-07-25 13:18:37.176442] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:57.046 [2024-07-25 13:18:37.176480] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:09:57.046 [2024-07-25 13:18:37.176488] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:09:57.046 [2024-07-25 13:18:37.184442] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:57.046 [2024-07-25 13:18:37.184461] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:09:57.046 [2024-07-25 13:18:37.192454] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:57.046 [2024-07-25 13:18:37.192470] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:09:57.046 [2024-07-25 13:18:37.253647] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:09:57.046 [2024-07-25 13:18:37.253685] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:57.046 [2024-07-25 13:18:37.253695] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x187ab80 00:09:57.046 [2024-07-25 13:18:37.253701] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 
00:09:57.046 [2024-07-25 13:18:37.254876] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:57.046 [2024-07-25 13:18:37.254896] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:09:57.046 [2024-07-25 13:18:37.396305] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.397194] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.398422] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.399346] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.400633] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.401518] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). 
Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.402812] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.403903] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.404635] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.405695] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.406413] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.407495] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.408225] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.409276] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.410002] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.411054] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:09:57.046 [2024-07-25 13:18:37.426499] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:57.046 [2024-07-25 13:18:37.427995] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:09:57.046 Running I/O for 5 seconds... 
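The bdevperf warnings above show the requested verify queue depth (`-q 128`) being clamped per bdev: 32 for each Malloc2p* bdev and 78 for AIO0. The clamping itself happens inside bdevperf's C code (bdevperf.c:1818); the shell below is only an illustrative model of the min() rule the warnings describe:

```shell
#!/bin/sh
# Illustrative model of bdevperf's verify-job queue-depth clamp: the
# effective depth is min(requested, per-bdev limit). Limits 32 and 78
# are the ones reported in the warnings above.
requested_qd=128

for bdev_limit in 32 78; do
  if [ "$requested_qd" -gt "$bdev_limit" ]; then
    effective_qd=$bdev_limit
  else
    effective_qd=$requested_qd
  fi
  echo "limit=$bdev_limit -> effective queue depth $effective_qd"
done
```

This is why the AIO0 rows in the results table reflect a shallower pipeline than the `-q 128` on the command line would suggest.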
00:10:05.180 00:10:05.180 Latency(us) 00:10:05.180 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:05.180 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x100 00:10:05.180 Malloc0 : 6.05 105.78 6.61 0.00 0.00 1184626.31 734.13 3123143.29 00:10:05.180 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x100 length 0x100 00:10:05.180 Malloc0 : 6.42 99.72 6.23 0.00 0.00 1252040.67 1109.07 3200576.59 00:10:05.180 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x80 00:10:05.180 Malloc1p0 : 6.52 74.90 4.68 0.00 0.00 1546715.80 1877.86 2916654.47 00:10:05.180 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x80 length 0x80 00:10:05.180 Malloc1p0 : 7.37 28.22 1.76 0.00 0.00 4059981.95 1777.03 6323719.88 00:10:05.180 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x80 00:10:05.180 Malloc1p1 : 6.83 32.81 2.05 0.00 0.00 3427512.02 1235.10 5730064.54 00:10:05.180 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x80 length 0x80 00:10:05.180 Malloc1p1 : 7.37 28.22 1.76 0.00 0.00 3884152.21 1777.03 6039797.76 00:10:05.180 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x20 00:10:05.180 Malloc2p0 : 6.52 22.10 1.38 0.00 0.00 1276525.00 475.77 2090699.22 00:10:05.180 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x20 length 0x20 00:10:05.180 Malloc2p0 : 6.72 16.68 1.04 0.00 0.00 1618351.81 690.02 2671449.01 00:10:05.180 Job: Malloc2p1 (Core Mask 
0x1, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x20 00:10:05.180 Malloc2p1 : 6.52 22.09 1.38 0.00 0.00 1263905.60 494.67 2051982.57 00:10:05.180 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x20 length 0x20 00:10:05.180 Malloc2p1 : 6.72 16.67 1.04 0.00 0.00 1598373.60 693.17 2632732.36 00:10:05.180 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x20 00:10:05.180 Malloc2p2 : 6.52 22.09 1.38 0.00 0.00 1250757.02 497.82 2026171.47 00:10:05.180 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x20 length 0x20 00:10:05.180 Malloc2p2 : 6.72 16.67 1.04 0.00 0.00 1578175.65 724.68 2594015.70 00:10:05.180 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x20 00:10:05.180 Malloc2p3 : 6.52 22.09 1.38 0.00 0.00 1239630.63 488.37 2000360.37 00:10:05.180 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x20 length 0x20 00:10:05.180 Malloc2p3 : 6.89 18.57 1.16 0.00 0.00 1421373.90 705.77 2555299.05 00:10:05.180 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x20 00:10:05.180 Malloc2p4 : 6.52 22.08 1.38 0.00 0.00 1227612.46 488.37 1974549.27 00:10:05.180 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x20 length 0x20 00:10:05.180 Malloc2p4 : 6.89 18.57 1.16 0.00 0.00 1402943.74 690.02 2516582.40 00:10:05.180 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x20 00:10:05.180 Malloc2p5 : 6.52 22.08 1.38 0.00 0.00 1215427.12 
488.37 1948738.17 00:10:05.180 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x20 length 0x20 00:10:05.180 Malloc2p5 : 6.90 18.56 1.16 0.00 0.00 1385500.54 683.72 2477865.75 00:10:05.180 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x20 00:10:05.180 Malloc2p6 : 6.52 22.07 1.38 0.00 0.00 1202932.18 535.63 1922927.06 00:10:05.180 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x20 length 0x20 00:10:05.180 Malloc2p6 : 6.90 18.56 1.16 0.00 0.00 1368042.52 696.32 2452054.65 00:10:05.180 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x20 00:10:05.180 Malloc2p7 : 6.52 22.07 1.38 0.00 0.00 1190597.39 488.37 1897115.96 00:10:05.180 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x20 length 0x20 00:10:05.180 Malloc2p7 : 6.90 18.55 1.16 0.00 0.00 1350721.19 712.07 2413337.99 00:10:05.180 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x100 00:10:05.180 TestPT : 6.72 33.62 2.10 0.00 0.00 3028261.08 105664.20 3974909.64 00:10:05.180 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x100 length 0x100 00:10:05.180 TestPT : 7.39 28.14 1.76 0.00 0.00 3430635.10 138734.67 4052342.94 00:10:05.180 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x200 00:10:05.180 raid0 : 6.77 37.80 2.36 0.00 0.00 2596381.92 1285.51 4852487.09 00:10:05.180 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x200 length 0x200 
00:10:05.180 raid0 : 7.20 38.63 2.41 0.00 0.00 2438033.25 1890.46 4955731.50 00:10:05.180 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x200 00:10:05.180 concat0 : 7.11 40.53 2.53 0.00 0.00 2302509.81 1272.91 4671809.38 00:10:05.180 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x200 length 0x200 00:10:05.180 concat0 : 7.26 44.05 2.75 0.00 0.00 2028660.28 1915.67 4749242.68 00:10:05.180 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x100 00:10:05.180 raid1 : 6.98 52.73 3.30 0.00 0.00 1747144.50 1663.61 4465320.57 00:10:05.180 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x100 length 0x100 00:10:05.180 raid1 : 7.40 71.39 4.46 0.00 0.00 1201166.91 2508.01 4542753.87 00:10:05.180 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x0 length 0x4e 00:10:05.180 AIO0 : 7.11 62.18 3.89 0.00 0.00 881677.84 529.33 3148954.39 00:10:05.180 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:10:05.180 Verification LBA range: start 0x4e length 0x4e 00:10:05.180 AIO0 : 7.60 115.72 7.23 0.00 0.00 438968.50 563.99 3690987.52 00:10:05.180 =================================================================================================================== 00:10:05.180 Total : 1213.91 75.87 0.00 0.00 1643008.57 475.77 6323719.88 00:10:05.180 00:10:05.180 real 0m8.494s 00:10:05.180 user 0m16.308s 00:10:05.180 sys 0m0.280s 00:10:05.180 13:18:45 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:05.180 13:18:45 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:10:05.180 ************************************ 00:10:05.180 END TEST 
bdev_verify_big_io 00:10:05.180 ************************************ 00:10:05.180 13:18:45 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:05.180 13:18:45 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:10:05.180 13:18:45 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:05.180 13:18:45 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:05.180 ************************************ 00:10:05.180 START TEST bdev_write_zeroes 00:10:05.180 ************************************ 00:10:05.180 13:18:45 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:05.180 [2024-07-25 13:18:45.462156] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:10:05.180 [2024-07-25 13:18:45.462225] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid858647 ] 00:10:05.180 [2024-07-25 13:18:45.564567] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:05.180 [2024-07-25 13:18:45.641364] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:05.181 [2024-07-25 13:18:45.777083] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:05.181 [2024-07-25 13:18:45.777119] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:10:05.181 [2024-07-25 13:18:45.777128] vbdev_passthru.c: 736:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:10:05.181 [2024-07-25 13:18:45.785092] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:05.181 [2024-07-25 13:18:45.785115] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:10:05.181 [2024-07-25 13:18:45.793104] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:05.181 [2024-07-25 13:18:45.793121] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:10:05.181 [2024-07-25 13:18:45.853983] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:10:05.181 [2024-07-25 13:18:45.854020] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:05.181 [2024-07-25 13:18:45.854030] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b08920 00:10:05.181 [2024-07-25 13:18:45.854036] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:05.181 [2024-07-25 13:18:45.855169] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:10:05.181 [2024-07-25 13:18:45.855189] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:10:05.440 Running I/O for 1 seconds... 00:10:06.381 00:10:06.381 Latency(us) 00:10:06.381 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:06.381 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc0 : 1.03 5981.41 23.36 0.00 0.00 21390.99 513.58 35691.91 00:10:06.381 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc1p0 : 1.03 5973.95 23.34 0.00 0.00 21388.14 746.73 35086.97 00:10:06.381 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc1p1 : 1.03 5966.55 23.31 0.00 0.00 21373.04 743.58 34280.37 00:10:06.381 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc2p0 : 1.03 5959.19 23.28 0.00 0.00 21361.61 743.58 33675.42 00:10:06.381 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc2p1 : 1.03 5951.81 23.25 0.00 0.00 21346.86 759.34 32868.82 00:10:06.381 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc2p2 : 1.03 5944.46 23.22 0.00 0.00 21333.59 743.58 32062.23 00:10:06.381 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc2p3 : 1.03 5937.15 23.19 0.00 0.00 21323.29 727.83 31457.28 00:10:06.381 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc2p4 : 1.04 5929.82 23.16 0.00 0.00 21306.04 756.18 30650.68 00:10:06.381 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc2p5 : 1.05 5972.29 23.33 0.00 0.00 21118.43 734.13 29844.09 00:10:06.381 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc2p6 : 1.05 5965.05 
23.30 0.00 0.00 21107.15 734.13 29239.14 00:10:06.381 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 Malloc2p7 : 1.05 5957.80 23.27 0.00 0.00 21091.74 756.18 28432.54 00:10:06.381 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 TestPT : 1.05 5950.62 23.24 0.00 0.00 21081.84 775.09 27625.94 00:10:06.381 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 raid0 : 1.06 5942.42 23.21 0.00 0.00 21064.95 1417.85 26416.05 00:10:06.381 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 concat0 : 1.06 5934.30 23.18 0.00 0.00 21026.61 1424.15 24903.68 00:10:06.381 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 raid1 : 1.06 5924.16 23.14 0.00 0.00 20985.10 2218.14 22887.19 00:10:06.381 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:10:06.381 AIO0 : 1.06 5918.03 23.12 0.00 0.00 20915.61 831.80 22887.19 00:10:06.381 =================================================================================================================== 00:10:06.381 Total : 95209.02 371.91 0.00 0.00 21199.37 513.58 35691.91 00:10:06.641 00:10:06.641 real 0m1.927s 00:10:06.641 user 0m1.615s 00:10:06.641 sys 0m0.225s 00:10:06.641 13:18:47 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:06.641 13:18:47 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:10:06.641 ************************************ 00:10:06.641 END TEST bdev_write_zeroes 00:10:06.641 ************************************ 00:10:06.641 13:18:47 blockdev_general -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:10:06.641 13:18:47 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:10:06.641 13:18:47 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:06.641 13:18:47 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:06.641 ************************************ 00:10:06.641 START TEST bdev_json_nonenclosed 00:10:06.641 ************************************ 00:10:06.641 13:18:47 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:06.901 [2024-07-25 13:18:47.472622] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:10:06.901 [2024-07-25 13:18:47.472674] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid859052 ] 00:10:06.901 [2024-07-25 13:18:47.563475] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:06.901 [2024-07-25 13:18:47.638781] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:06.901 [2024-07-25 13:18:47.638836] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:10:06.901 [2024-07-25 13:18:47.638846] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:06.901 [2024-07-25 13:18:47.638852] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:07.161 00:10:07.161 real 0m0.280s 00:10:07.161 user 0m0.174s 00:10:07.161 sys 0m0.104s 00:10:07.161 13:18:47 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:07.161 13:18:47 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:10:07.161 ************************************ 00:10:07.161 END TEST bdev_json_nonenclosed 00:10:07.161 ************************************ 00:10:07.161 13:18:47 blockdev_general -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:07.161 13:18:47 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:10:07.161 13:18:47 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:07.161 13:18:47 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.161 ************************************ 00:10:07.161 START TEST bdev_json_nonarray 00:10:07.161 ************************************ 00:10:07.161 13:18:47 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:10:07.161 [2024-07-25 13:18:47.821805] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:10:07.161 [2024-07-25 13:18:47.821869] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid859156 ] 00:10:07.161 [2024-07-25 13:18:47.920072] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:07.421 [2024-07-25 13:18:47.984696] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:07.421 [2024-07-25 13:18:47.984751] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:10:07.421 [2024-07-25 13:18:47.984761] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:10:07.421 [2024-07-25 13:18:47.984769] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:07.421 00:10:07.421 real 0m0.267s 00:10:07.421 user 0m0.158s 00:10:07.421 sys 0m0.107s 00:10:07.421 13:18:48 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:07.421 13:18:48 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:10:07.421 ************************************ 00:10:07.421 END TEST bdev_json_nonarray 00:10:07.421 ************************************ 00:10:07.421 13:18:48 blockdev_general -- bdev/blockdev.sh@786 -- # [[ bdev == bdev ]] 00:10:07.421 13:18:48 blockdev_general -- bdev/blockdev.sh@787 -- # run_test bdev_qos qos_test_suite '' 00:10:07.421 13:18:48 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:07.421 13:18:48 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:07.421 13:18:48 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:07.421 ************************************ 00:10:07.421 START TEST bdev_qos 00:10:07.421 ************************************ 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- 
common/autotest_common.sh@1125 -- # qos_test_suite '' 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # QOS_PID=859184 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # echo 'Process qos testing pid: 859184' 00:10:07.421 Process qos testing pid: 859184 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@444 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # waitforlisten 859184 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@831 -- # '[' -z 859184 ']' 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:07.421 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:07.421 13:18:48 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:07.421 [2024-07-25 13:18:48.173183] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:10:07.421 [2024-07-25 13:18:48.173227] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid859184 ] 00:10:07.682 [2024-07-25 13:18:48.258138] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:07.682 [2024-07-25 13:18:48.347196] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@864 -- # return 0 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@450 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:08.624 Malloc_0 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # waitforbdev Malloc_0 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_0 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # local i 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:08.624 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:08.624 13:18:49 
blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:08.885 [ 00:10:08.885 { 00:10:08.885 "name": "Malloc_0", 00:10:08.885 "aliases": [ 00:10:08.885 "a8676160-ec14-4668-ba46-8f890f1e12bf" 00:10:08.885 ], 00:10:08.885 "product_name": "Malloc disk", 00:10:08.885 "block_size": 512, 00:10:08.885 "num_blocks": 262144, 00:10:08.885 "uuid": "a8676160-ec14-4668-ba46-8f890f1e12bf", 00:10:08.885 "assigned_rate_limits": { 00:10:08.885 "rw_ios_per_sec": 0, 00:10:08.885 "rw_mbytes_per_sec": 0, 00:10:08.885 "r_mbytes_per_sec": 0, 00:10:08.885 "w_mbytes_per_sec": 0 00:10:08.885 }, 00:10:08.885 "claimed": false, 00:10:08.885 "zoned": false, 00:10:08.885 "supported_io_types": { 00:10:08.885 "read": true, 00:10:08.885 "write": true, 00:10:08.885 "unmap": true, 00:10:08.885 "flush": true, 00:10:08.885 "reset": true, 00:10:08.885 "nvme_admin": false, 00:10:08.885 "nvme_io": false, 00:10:08.885 "nvme_io_md": false, 00:10:08.885 "write_zeroes": true, 00:10:08.885 "zcopy": true, 00:10:08.885 "get_zone_info": false, 00:10:08.885 "zone_management": false, 00:10:08.885 "zone_append": false, 00:10:08.885 "compare": false, 00:10:08.885 "compare_and_write": false, 00:10:08.885 "abort": true, 00:10:08.885 "seek_hole": false, 00:10:08.885 "seek_data": false, 00:10:08.885 "copy": true, 00:10:08.885 "nvme_iov_md": false 00:10:08.885 }, 00:10:08.885 "memory_domains": [ 00:10:08.885 { 00:10:08.885 "dma_device_id": "system", 00:10:08.885 "dma_device_type": 1 00:10:08.885 }, 00:10:08.885 { 00:10:08.885 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:08.885 
"dma_device_type": 2 00:10:08.885 } 00:10:08.885 ], 00:10:08.885 "driver_specific": {} 00:10:08.885 } 00:10:08.885 ] 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@907 -- # return 0 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # rpc_cmd bdev_null_create Null_1 128 512 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:08.885 Null_1 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # waitforbdev Null_1 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_name=Null_1 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # local i 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:10:08.885 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:08.885 
13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:08.885 [ 00:10:08.885 { 00:10:08.885 "name": "Null_1", 00:10:08.885 "aliases": [ 00:10:08.885 "03b7e2ab-58ba-4dc4-9280-bccfb4259e42" 00:10:08.885 ], 00:10:08.885 "product_name": "Null disk", 00:10:08.885 "block_size": 512, 00:10:08.885 "num_blocks": 262144, 00:10:08.885 "uuid": "03b7e2ab-58ba-4dc4-9280-bccfb4259e42", 00:10:08.885 "assigned_rate_limits": { 00:10:08.885 "rw_ios_per_sec": 0, 00:10:08.885 "rw_mbytes_per_sec": 0, 00:10:08.885 "r_mbytes_per_sec": 0, 00:10:08.885 "w_mbytes_per_sec": 0 00:10:08.885 }, 00:10:08.885 "claimed": false, 00:10:08.885 "zoned": false, 00:10:08.885 "supported_io_types": { 00:10:08.885 "read": true, 00:10:08.885 "write": true, 00:10:08.885 "unmap": false, 00:10:08.885 "flush": false, 00:10:08.885 "reset": true, 00:10:08.885 "nvme_admin": false, 00:10:08.885 "nvme_io": false, 00:10:08.885 "nvme_io_md": false, 00:10:08.885 "write_zeroes": true, 00:10:08.885 "zcopy": false, 00:10:08.885 "get_zone_info": false, 00:10:08.885 "zone_management": false, 00:10:08.885 "zone_append": false, 00:10:08.885 "compare": false, 00:10:08.885 "compare_and_write": false, 00:10:08.885 "abort": true, 00:10:08.885 "seek_hole": false, 00:10:08.886 "seek_data": false, 00:10:08.886 "copy": false, 00:10:08.886 "nvme_iov_md": false 00:10:08.886 }, 00:10:08.886 "driver_specific": {} 00:10:08.886 } 00:10:08.886 ] 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- common/autotest_common.sh@907 -- # return 0 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # qos_function_test 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@409 -- # local qos_lower_iops_limit=1000 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_bw_limit=2 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@455 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local io_result=0 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local iops_limit=0 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local bw_limit=0 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # get_io_result IOPS Malloc_0 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:10:08.886 13:18:49 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:10:08.886 Running I/O for 60 seconds... 
00:10:14.174 13:18:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 47865.48 191461.94 0.00 0.00 192512.00 0.00 0.00 ' 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # iostat_result=47865.48 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 47865 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@415 -- # io_result=47865 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@417 -- # iops_limit=11000 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # '[' 11000 -gt 1000 ']' 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@421 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 11000 Malloc_0 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # run_test bdev_qos_iops run_qos_test 11000 IOPS Malloc_0 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:14.174 13:18:54 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:14.174 ************************************ 00:10:14.174 START TEST bdev_qos_iops 00:10:14.174 ************************************ 00:10:14.174 13:18:54 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1125 -- # run_qos_test 11000 IOPS Malloc_0 00:10:14.174 13:18:54 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@388 -- # local qos_limit=11000 00:10:14.174 13:18:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_result=0 00:10:14.174 13:18:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # get_io_result IOPS Malloc_0 00:10:14.174 13:18:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@374 -- # local limit_type=IOPS 00:10:14.174 13:18:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:10:14.174 13:18:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:14.174 13:18:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:14.174 13:18:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:10:14.174 13:18:54 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # tail -1 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 11000.24 44000.98 0.00 0.00 44968.00 0.00 0.00 ' 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # '[' IOPS = IOPS ']' 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # awk '{print $2}' 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # iostat_result=11000.24 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@384 -- # echo 11000 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@391 -- # qos_result=11000 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # '[' IOPS = BANDWIDTH ']' 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@395 -- # lower_limit=9900 00:10:19.465 13:19:00 
blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # upper_limit=12100 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 11000 -lt 9900 ']' 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@399 -- # '[' 11000 -gt 12100 ']' 00:10:19.465 00:10:19.465 real 0m5.253s 00:10:19.465 user 0m0.102s 00:10:19.465 sys 0m0.045s 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:19.465 13:19:00 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:10:19.465 ************************************ 00:10:19.465 END TEST bdev_qos_iops 00:10:19.465 ************************************ 00:10:19.465 13:19:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # get_io_result BANDWIDTH Null_1 00:10:19.465 13:19:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:10:19.465 13:19:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:10:19.465 13:19:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:19.465 13:19:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:19.465 13:19:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # grep Null_1 00:10:19.465 13:19:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # tail -1 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 17044.13 68176.50 0.00 0.00 69632.00 0.00 0.00 ' 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:10:24.751 13:19:05 
blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # iostat_result=69632.00 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@384 -- # echo 69632 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@426 -- # bw_limit=69632 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=6 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # '[' 6 -lt 2 ']' 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@431 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 6 Null_1 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # run_test bdev_qos_bw run_qos_test 6 BANDWIDTH Null_1 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:24.751 13:19:05 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:24.751 ************************************ 00:10:24.751 START TEST bdev_qos_bw 00:10:24.751 ************************************ 00:10:24.751 13:19:05 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1125 -- # run_qos_test 6 BANDWIDTH Null_1 00:10:24.751 13:19:05 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@388 -- # local qos_limit=6 00:10:24.751 13:19:05 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:10:24.751 13:19:05 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # get_io_result BANDWIDTH Null_1 00:10:24.751 13:19:05 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@374 -- # local 
limit_type=BANDWIDTH 00:10:24.751 13:19:05 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Null_1 00:10:24.751 13:19:05 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:24.751 13:19:05 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:24.751 13:19:05 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # grep Null_1 00:10:24.751 13:19:05 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # tail -1 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # iostat_result='Null_1 1536.26 6145.05 0.00 0.00 6364.00 0.00 0.00 ' 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # iostat_result=6364.00 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@384 -- # echo 6364 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@391 -- # qos_result=6364 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # qos_limit=6144 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@395 -- # lower_limit=5529 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # upper_limit=6758 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 6364 -lt 5529 ']' 00:10:30.090 13:19:10 
blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@399 -- # '[' 6364 -gt 6758 ']' 00:10:30.090 00:10:30.090 real 0m5.309s 00:10:30.090 user 0m0.108s 00:10:30.090 sys 0m0.043s 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:30.090 13:19:10 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:10:30.090 ************************************ 00:10:30.090 END TEST bdev_qos_bw 00:10:30.090 ************************************ 00:10:30.090 13:19:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@435 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:10:30.090 13:19:10 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:30.090 13:19:10 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:30.350 13:19:10 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:30.350 13:19:10 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:10:30.350 13:19:10 blockdev_general.bdev_qos -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:10:30.350 13:19:10 blockdev_general.bdev_qos -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:30.350 13:19:10 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:30.350 ************************************ 00:10:30.350 START TEST bdev_qos_ro_bw 00:10:30.350 ************************************ 00:10:30.350 13:19:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1125 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:10:30.350 13:19:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@388 -- # local qos_limit=2 00:10:30.350 13:19:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_result=0 00:10:30.350 13:19:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # get_io_result 
BANDWIDTH Malloc_0 00:10:30.350 13:19:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@374 -- # local limit_type=BANDWIDTH 00:10:30.350 13:19:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local qos_dev=Malloc_0 00:10:30.350 13:19:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local iostat_result 00:10:30.350 13:19:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:10:30.350 13:19:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # grep Malloc_0 00:10:30.350 13:19:10 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # tail -1 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # iostat_result='Malloc_0 512.09 2048.35 0.00 0.00 2056.00 0.00 0.00 ' 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # '[' BANDWIDTH = IOPS ']' 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@380 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # awk '{print $6}' 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # iostat_result=2056.00 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@384 -- # echo 2056 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@391 -- # qos_result=2056 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # qos_limit=2048 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@395 -- # lower_limit=1843 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 
-- # upper_limit=2252 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2056 -lt 1843 ']' 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@399 -- # '[' 2056 -gt 2252 ']' 00:10:35.636 00:10:35.636 real 0m5.185s 00:10:35.636 user 0m0.112s 00:10:35.636 sys 0m0.044s 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:35.636 13:19:16 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:10:35.636 ************************************ 00:10:35.636 END TEST bdev_qos_ro_bw 00:10:35.636 ************************************ 00:10:35.636 13:19:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@458 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:10:35.636 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:35.636 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_null_delete Null_1 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:36.206 00:10:36.206 Latency(us) 00:10:36.206 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:36.206 Job: Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:36.206 Malloc_0 : 27.03 15848.59 61.91 0.00 0.00 15996.34 2571.03 503316.48 00:10:36.206 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:36.206 Null_1 : 27.22 16450.63 64.26 0.00 0.00 15503.11 1058.66 201649.23 00:10:36.206 
=================================================================================================================== 00:10:36.206 Total : 32299.22 126.17 0.00 0.00 15744.27 1058.66 503316.48 00:10:36.206 0 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # killprocess 859184 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@950 -- # '[' -z 859184 ']' 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # kill -0 859184 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # uname 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 859184 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@968 -- # echo 'killing process with pid 859184' 00:10:36.206 killing process with pid 859184 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@969 -- # kill 859184 00:10:36.206 Received shutdown signal, test time was about 27.292528 seconds 00:10:36.206 00:10:36.206 Latency(us) 00:10:36.206 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:36.206 =================================================================================================================== 00:10:36.206 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:36.206 13:19:16 blockdev_general.bdev_qos -- common/autotest_common.sh@974 -- # wait 859184 00:10:36.467 13:19:17 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # trap - SIGINT SIGTERM EXIT 
00:10:36.467 00:10:36.467 real 0m29.001s 00:10:36.467 user 0m30.143s 00:10:36.467 sys 0m0.777s 00:10:36.467 13:19:17 blockdev_general.bdev_qos -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:36.467 13:19:17 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:10:36.467 ************************************ 00:10:36.467 END TEST bdev_qos 00:10:36.467 ************************************ 00:10:36.467 13:19:17 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:10:36.467 13:19:17 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:36.467 13:19:17 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:36.467 13:19:17 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:36.467 ************************************ 00:10:36.467 START TEST bdev_qd_sampling 00:10:36.467 ************************************ 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1125 -- # qd_sampling_test_suite '' 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@537 -- # QD_DEV=Malloc_QD 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # QD_PID=863943 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # echo 'Process bdev QD sampling period testing pid: 863943' 00:10:36.467 Process bdev QD sampling period testing pid: 863943 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # waitforlisten 863943 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling 
-- common/autotest_common.sh@831 -- # '[' -z 863943 ']' 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:36.467 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:36.467 13:19:17 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:36.727 [2024-07-25 13:19:17.262283] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:10:36.727 [2024-07-25 13:19:17.262340] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid863943 ] 00:10:36.727 [2024-07-25 13:19:17.354837] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:36.727 [2024-07-25 13:19:17.450585] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:36.727 [2024-07-25 13:19:17.450595] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:37.685 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:37.685 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@864 -- # return 0 00:10:37.685 13:19:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@545 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:10:37.685 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:37.685 
13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:37.685 Malloc_QD 00:10:37.685 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:37.685 13:19:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # waitforbdev Malloc_QD 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_QD 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # local i 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:37.686 [ 00:10:37.686 { 00:10:37.686 "name": "Malloc_QD", 00:10:37.686 "aliases": [ 00:10:37.686 "c1eb17b5-6fcd-4c0b-a0e1-8578a3f91844" 00:10:37.686 ], 00:10:37.686 "product_name": "Malloc disk", 00:10:37.686 "block_size": 512, 00:10:37.686 "num_blocks": 262144, 00:10:37.686 "uuid": "c1eb17b5-6fcd-4c0b-a0e1-8578a3f91844", 
00:10:37.686 "assigned_rate_limits": { 00:10:37.686 "rw_ios_per_sec": 0, 00:10:37.686 "rw_mbytes_per_sec": 0, 00:10:37.686 "r_mbytes_per_sec": 0, 00:10:37.686 "w_mbytes_per_sec": 0 00:10:37.686 }, 00:10:37.686 "claimed": false, 00:10:37.686 "zoned": false, 00:10:37.686 "supported_io_types": { 00:10:37.686 "read": true, 00:10:37.686 "write": true, 00:10:37.686 "unmap": true, 00:10:37.686 "flush": true, 00:10:37.686 "reset": true, 00:10:37.686 "nvme_admin": false, 00:10:37.686 "nvme_io": false, 00:10:37.686 "nvme_io_md": false, 00:10:37.686 "write_zeroes": true, 00:10:37.686 "zcopy": true, 00:10:37.686 "get_zone_info": false, 00:10:37.686 "zone_management": false, 00:10:37.686 "zone_append": false, 00:10:37.686 "compare": false, 00:10:37.686 "compare_and_write": false, 00:10:37.686 "abort": true, 00:10:37.686 "seek_hole": false, 00:10:37.686 "seek_data": false, 00:10:37.686 "copy": true, 00:10:37.686 "nvme_iov_md": false 00:10:37.686 }, 00:10:37.686 "memory_domains": [ 00:10:37.686 { 00:10:37.686 "dma_device_id": "system", 00:10:37.686 "dma_device_type": 1 00:10:37.686 }, 00:10:37.686 { 00:10:37.686 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:37.686 "dma_device_type": 2 00:10:37.686 } 00:10:37.686 ], 00:10:37.686 "driver_specific": {} 00:10:37.686 } 00:10:37.686 ] 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@907 -- # return 0 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # sleep 2 00:10:37.686 13:19:18 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:37.686 Running I/O for 5 seconds... 
00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # qd_sampling_function_test Malloc_QD 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@518 -- # local bdev_name=Malloc_QD 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local sampling_period=10 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local iostats 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@522 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:39.599 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@524 -- # iostats='{ 00:10:39.599 "tick_rate": 2600000000, 00:10:39.599 "ticks": 14831173754326873, 00:10:39.599 "bdevs": [ 00:10:39.599 { 00:10:39.599 "name": "Malloc_QD", 00:10:39.599 "bytes_read": 993047040, 00:10:39.599 "num_read_ops": 242436, 00:10:39.599 "bytes_written": 0, 00:10:39.599 "num_write_ops": 0, 00:10:39.599 "bytes_unmapped": 0, 00:10:39.599 "num_unmap_ops": 0, 00:10:39.599 "bytes_copied": 0, 00:10:39.599 "num_copy_ops": 0, 00:10:39.599 "read_latency_ticks": 2567141205326, 00:10:39.599 "max_read_latency_ticks": 15683818, 00:10:39.599 "min_read_latency_ticks": 307624, 
00:10:39.599 "write_latency_ticks": 0, 00:10:39.599 "max_write_latency_ticks": 0, 00:10:39.599 "min_write_latency_ticks": 0, 00:10:39.599 "unmap_latency_ticks": 0, 00:10:39.600 "max_unmap_latency_ticks": 0, 00:10:39.600 "min_unmap_latency_ticks": 0, 00:10:39.600 "copy_latency_ticks": 0, 00:10:39.600 "max_copy_latency_ticks": 0, 00:10:39.600 "min_copy_latency_ticks": 0, 00:10:39.600 "io_error": {}, 00:10:39.600 "queue_depth_polling_period": 10, 00:10:39.600 "queue_depth": 768, 00:10:39.600 "io_time": 60, 00:10:39.600 "weighted_io_time": 40960 00:10:39.600 } 00:10:39.600 ] 00:10:39.600 }' 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@526 -- # qd_sampling_period=10 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 == null ']' 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@528 -- # '[' 10 -ne 10 ']' 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@552 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:39.600 00:10:39.600 Latency(us) 00:10:39.600 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:39.600 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:10:39.600 Malloc_QD : 2.00 71949.31 281.05 0.00 0.00 3550.46 989.34 4032.98 00:10:39.600 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:10:39.600 Malloc_QD : 2.01 53583.18 209.31 0.00 0.00 4765.92 1291.82 6049.48 00:10:39.600 =================================================================================================================== 00:10:39.600 Total : 125532.49 490.36 
0.00 0.00 4069.78 989.34 6049.48 00:10:39.600 0 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # killprocess 863943 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@950 -- # '[' -z 863943 ']' 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # kill -0 863943 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # uname 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:39.600 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 863943 00:10:39.860 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:39.860 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:39.860 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@968 -- # echo 'killing process with pid 863943' 00:10:39.860 killing process with pid 863943 00:10:39.860 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@969 -- # kill 863943 00:10:39.860 Received shutdown signal, test time was about 2.085977 seconds 00:10:39.860 00:10:39.860 Latency(us) 00:10:39.860 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:39.860 =================================================================================================================== 00:10:39.860 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:39.860 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@974 -- # wait 863943 00:10:39.860 13:19:20 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # trap - SIGINT SIGTERM EXIT 00:10:39.860 00:10:39.860 real 0m3.355s 00:10:39.860 
user 0m6.637s 00:10:39.860 sys 0m0.376s 00:10:39.860 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:39.860 13:19:20 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:10:39.860 ************************************ 00:10:39.860 END TEST bdev_qd_sampling 00:10:39.860 ************************************ 00:10:39.860 13:19:20 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_error error_test_suite '' 00:10:39.861 13:19:20 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:39.861 13:19:20 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:39.861 13:19:20 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:39.861 ************************************ 00:10:39.861 START TEST bdev_error 00:10:39.861 ************************************ 00:10:39.861 13:19:20 blockdev_general.bdev_error -- common/autotest_common.sh@1125 -- # error_test_suite '' 00:10:39.861 13:19:20 blockdev_general.bdev_error -- bdev/blockdev.sh@465 -- # DEV_1=Dev_1 00:10:39.861 13:19:20 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_2=Dev_2 00:10:39.861 13:19:20 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # ERR_DEV=EE_Dev_1 00:10:39.861 13:19:20 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # ERR_PID=864527 00:10:39.861 13:19:20 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # echo 'Process error testing pid: 864527' 00:10:39.861 Process error testing pid: 864527 00:10:39.861 13:19:20 blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # waitforlisten 864527 00:10:39.861 13:19:20 blockdev_general.bdev_error -- bdev/blockdev.sh@470 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:10:39.861 13:19:20 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # '[' -z 864527 ']' 00:10:39.861 13:19:20 
blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:39.861 13:19:20 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:39.861 13:19:20 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:39.861 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:39.861 13:19:20 blockdev_general.bdev_error -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:39.861 13:19:20 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:40.121 [2024-07-25 13:19:20.697372] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:10:40.121 [2024-07-25 13:19:20.697427] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid864527 ] 00:10:40.121 [2024-07-25 13:19:20.790323] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:40.121 [2024-07-25 13:19:20.901013] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@864 -- # return 0 00:10:41.062 13:19:21 blockdev_general.bdev_error -- bdev/blockdev.sh@475 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:41.062 Dev_1 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.062 13:19:21 
blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # waitforbdev Dev_1 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_1 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:41.062 [ 00:10:41.062 { 00:10:41.062 "name": "Dev_1", 00:10:41.062 "aliases": [ 00:10:41.062 "087f347c-de35-4ed4-bb2a-786bed55d6e6" 00:10:41.062 ], 00:10:41.062 "product_name": "Malloc disk", 00:10:41.062 "block_size": 512, 00:10:41.062 "num_blocks": 262144, 00:10:41.062 "uuid": "087f347c-de35-4ed4-bb2a-786bed55d6e6", 00:10:41.062 "assigned_rate_limits": { 00:10:41.062 "rw_ios_per_sec": 0, 00:10:41.062 "rw_mbytes_per_sec": 0, 00:10:41.062 "r_mbytes_per_sec": 0, 00:10:41.062 "w_mbytes_per_sec": 0 00:10:41.062 }, 00:10:41.062 "claimed": false, 00:10:41.062 "zoned": false, 00:10:41.062 "supported_io_types": { 00:10:41.062 "read": true, 00:10:41.062 
"write": true, 00:10:41.062 "unmap": true, 00:10:41.062 "flush": true, 00:10:41.062 "reset": true, 00:10:41.062 "nvme_admin": false, 00:10:41.062 "nvme_io": false, 00:10:41.062 "nvme_io_md": false, 00:10:41.062 "write_zeroes": true, 00:10:41.062 "zcopy": true, 00:10:41.062 "get_zone_info": false, 00:10:41.062 "zone_management": false, 00:10:41.062 "zone_append": false, 00:10:41.062 "compare": false, 00:10:41.062 "compare_and_write": false, 00:10:41.062 "abort": true, 00:10:41.062 "seek_hole": false, 00:10:41.062 "seek_data": false, 00:10:41.062 "copy": true, 00:10:41.062 "nvme_iov_md": false 00:10:41.062 }, 00:10:41.062 "memory_domains": [ 00:10:41.062 { 00:10:41.062 "dma_device_id": "system", 00:10:41.062 "dma_device_type": 1 00:10:41.062 }, 00:10:41.062 { 00:10:41.062 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:41.062 "dma_device_type": 2 00:10:41.062 } 00:10:41.062 ], 00:10:41.062 "driver_specific": {} 00:10:41.062 } 00:10:41.062 ] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:10:41.062 13:19:21 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # rpc_cmd bdev_error_create Dev_1 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:41.062 true 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:41.062 Dev_2 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@589 
-- # [[ 0 == 0 ]] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # waitforbdev Dev_2 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_2 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:41.062 [ 00:10:41.062 { 00:10:41.062 "name": "Dev_2", 00:10:41.062 "aliases": [ 00:10:41.062 "c4228fb8-2ba1-46ad-ba9d-7959119ebb45" 00:10:41.062 ], 00:10:41.062 "product_name": "Malloc disk", 00:10:41.062 "block_size": 512, 00:10:41.062 "num_blocks": 262144, 00:10:41.062 "uuid": "c4228fb8-2ba1-46ad-ba9d-7959119ebb45", 00:10:41.062 "assigned_rate_limits": { 00:10:41.062 "rw_ios_per_sec": 0, 00:10:41.062 "rw_mbytes_per_sec": 0, 00:10:41.062 "r_mbytes_per_sec": 0, 00:10:41.062 "w_mbytes_per_sec": 0 00:10:41.062 }, 00:10:41.062 "claimed": false, 00:10:41.062 "zoned": false, 00:10:41.062 "supported_io_types": { 
00:10:41.062 "read": true, 00:10:41.062 "write": true, 00:10:41.062 "unmap": true, 00:10:41.062 "flush": true, 00:10:41.062 "reset": true, 00:10:41.062 "nvme_admin": false, 00:10:41.062 "nvme_io": false, 00:10:41.062 "nvme_io_md": false, 00:10:41.062 "write_zeroes": true, 00:10:41.062 "zcopy": true, 00:10:41.062 "get_zone_info": false, 00:10:41.062 "zone_management": false, 00:10:41.062 "zone_append": false, 00:10:41.062 "compare": false, 00:10:41.062 "compare_and_write": false, 00:10:41.062 "abort": true, 00:10:41.062 "seek_hole": false, 00:10:41.062 "seek_data": false, 00:10:41.062 "copy": true, 00:10:41.062 "nvme_iov_md": false 00:10:41.062 }, 00:10:41.062 "memory_domains": [ 00:10:41.062 { 00:10:41.062 "dma_device_id": "system", 00:10:41.062 "dma_device_type": 1 00:10:41.062 }, 00:10:41.062 { 00:10:41.062 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:41.062 "dma_device_type": 2 00:10:41.062 } 00:10:41.062 ], 00:10:41.062 "driver_specific": {} 00:10:41.062 } 00:10:41.062 ] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:10:41.062 13:19:21 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:41.062 13:19:21 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:41.062 13:19:21 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # sleep 1 00:10:41.063 13:19:21 blockdev_general.bdev_error -- bdev/blockdev.sh@482 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:41.063 Running I/O for 5 seconds... 
00:10:42.001 13:19:22 blockdev_general.bdev_error -- bdev/blockdev.sh@486 -- # kill -0 864527 00:10:42.001 13:19:22 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # echo 'Process is existed as continue on error is set. Pid: 864527' 00:10:42.001 Process is existed as continue on error is set. Pid: 864527 00:10:42.001 13:19:22 blockdev_general.bdev_error -- bdev/blockdev.sh@494 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:10:42.001 13:19:22 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:42.001 13:19:22 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:42.002 13:19:22 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:42.002 13:19:22 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_malloc_delete Dev_1 00:10:42.002 13:19:22 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:42.002 13:19:22 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:42.002 13:19:22 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:42.002 13:19:22 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # sleep 5 00:10:42.261 Timeout while waiting for response: 00:10:42.261 00:10:42.261 00:10:46.461 00:10:46.461 Latency(us) 00:10:46.461 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:46.461 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:46.461 EE_Dev_1 : 0.91 28708.12 112.14 5.50 0.00 552.66 189.83 894.82 00:10:46.461 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:46.461 Dev_2 : 5.00 62496.43 244.13 0.00 0.00 251.51 86.25 19156.68 00:10:46.462 =================================================================================================================== 00:10:46.462 Total : 91204.54 356.27 5.50 0.00 274.72 86.25 19156.68 00:10:47.032 13:19:27 blockdev_general.bdev_error -- 
bdev/blockdev.sh@498 -- # killprocess 864527 00:10:47.032 13:19:27 blockdev_general.bdev_error -- common/autotest_common.sh@950 -- # '[' -z 864527 ']' 00:10:47.032 13:19:27 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # kill -0 864527 00:10:47.032 13:19:27 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # uname 00:10:47.032 13:19:27 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:47.032 13:19:27 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 864527 00:10:47.292 13:19:27 blockdev_general.bdev_error -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:10:47.292 13:19:27 blockdev_general.bdev_error -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:10:47.292 13:19:27 blockdev_general.bdev_error -- common/autotest_common.sh@968 -- # echo 'killing process with pid 864527' 00:10:47.292 killing process with pid 864527 00:10:47.292 13:19:27 blockdev_general.bdev_error -- common/autotest_common.sh@969 -- # kill 864527 00:10:47.292 Received shutdown signal, test time was about 5.000000 seconds 00:10:47.292 00:10:47.292 Latency(us) 00:10:47.292 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:47.292 =================================================================================================================== 00:10:47.292 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:10:47.292 13:19:27 blockdev_general.bdev_error -- common/autotest_common.sh@974 -- # wait 864527 00:10:47.292 13:19:28 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # ERR_PID=865747 00:10:47.292 13:19:28 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # echo 'Process error testing pid: 865747' 00:10:47.292 Process error testing pid: 865747 00:10:47.292 13:19:28 blockdev_general.bdev_error -- bdev/blockdev.sh@501 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w 
randread -t 5 '' 00:10:47.292 13:19:28 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # waitforlisten 865747 00:10:47.292 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@831 -- # '[' -z 865747 ']' 00:10:47.292 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:47.292 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:47.292 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:47.292 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:47.292 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:47.292 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:47.552 [2024-07-25 13:19:28.103319] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:10:47.552 [2024-07-25 13:19:28.103388] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid865747 ] 00:10:47.552 [2024-07-25 13:19:28.194999] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:47.552 [2024-07-25 13:19:28.303106] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:48.495 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:48.495 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@864 -- # return 0 00:10:48.495 13:19:28 blockdev_general.bdev_error -- bdev/blockdev.sh@506 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:10:48.496 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:48.496 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.496 Dev_1 00:10:48.496 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:48.496 13:19:28 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # waitforbdev Dev_1 00:10:48.496 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_1 00:10:48.496 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:48.496 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:10:48.496 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:48.496 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:48.496 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:48.496 13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:48.496 
13:19:28 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.496 [ 00:10:48.496 { 00:10:48.496 "name": "Dev_1", 00:10:48.496 "aliases": [ 00:10:48.496 "9b469425-ac00-493d-9f73-35ce206798bd" 00:10:48.496 ], 00:10:48.496 "product_name": "Malloc disk", 00:10:48.496 "block_size": 512, 00:10:48.496 "num_blocks": 262144, 00:10:48.496 "uuid": "9b469425-ac00-493d-9f73-35ce206798bd", 00:10:48.496 "assigned_rate_limits": { 00:10:48.496 "rw_ios_per_sec": 0, 00:10:48.496 "rw_mbytes_per_sec": 0, 00:10:48.496 "r_mbytes_per_sec": 0, 00:10:48.496 "w_mbytes_per_sec": 0 00:10:48.496 }, 00:10:48.496 "claimed": false, 00:10:48.496 "zoned": false, 00:10:48.496 "supported_io_types": { 00:10:48.496 "read": true, 00:10:48.496 "write": true, 00:10:48.496 "unmap": true, 00:10:48.496 "flush": true, 00:10:48.496 "reset": true, 00:10:48.496 "nvme_admin": false, 00:10:48.496 "nvme_io": false, 00:10:48.496 "nvme_io_md": false, 00:10:48.496 "write_zeroes": true, 00:10:48.496 "zcopy": true, 00:10:48.496 "get_zone_info": false, 00:10:48.496 "zone_management": false, 00:10:48.496 "zone_append": false, 00:10:48.496 "compare": false, 00:10:48.496 "compare_and_write": false, 00:10:48.496 "abort": true, 00:10:48.496 "seek_hole": false, 00:10:48.496 "seek_data": false, 00:10:48.496 "copy": true, 00:10:48.496 "nvme_iov_md": false 00:10:48.496 }, 00:10:48.496 "memory_domains": [ 00:10:48.496 { 00:10:48.496 "dma_device_id": "system", 00:10:48.496 "dma_device_type": 1 00:10:48.496 }, 00:10:48.496 { 00:10:48.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:10:48.496 "dma_device_type": 2 00:10:48.496 } 00:10:48.496 ], 00:10:48.496 "driver_specific": {} 00:10:48.496 } 00:10:48.496 ] 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:10:48.496 13:19:29 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # rpc_cmd bdev_error_create Dev_1 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.496 true 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:48.496 13:19:29 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.496 Dev_2 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:48.496 13:19:29 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # waitforbdev Dev_2 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_name=Dev_2 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # local i 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:48.496 13:19:29 blockdev_general.bdev_error -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.496 [ 00:10:48.496 { 00:10:48.496 "name": "Dev_2", 00:10:48.496 "aliases": [ 00:10:48.496 "f5855f69-e627-48d1-927e-9a147a3538a6" 00:10:48.496 ], 00:10:48.496 "product_name": "Malloc disk", 00:10:48.496 "block_size": 512, 00:10:48.496 "num_blocks": 262144, 00:10:48.496 "uuid": "f5855f69-e627-48d1-927e-9a147a3538a6", 00:10:48.496 "assigned_rate_limits": { 00:10:48.496 "rw_ios_per_sec": 0, 00:10:48.496 "rw_mbytes_per_sec": 0, 00:10:48.496 "r_mbytes_per_sec": 0, 00:10:48.496 "w_mbytes_per_sec": 0 00:10:48.496 }, 00:10:48.496 "claimed": false, 00:10:48.496 "zoned": false, 00:10:48.496 "supported_io_types": { 00:10:48.496 "read": true, 00:10:48.496 "write": true, 00:10:48.496 "unmap": true, 00:10:48.496 "flush": true, 00:10:48.496 "reset": true, 00:10:48.496 "nvme_admin": false, 00:10:48.496 "nvme_io": false, 00:10:48.496 "nvme_io_md": false, 00:10:48.496 "write_zeroes": true, 00:10:48.496 "zcopy": true, 00:10:48.496 "get_zone_info": false, 00:10:48.496 "zone_management": false, 00:10:48.496 "zone_append": false, 00:10:48.496 "compare": false, 00:10:48.496 "compare_and_write": false, 00:10:48.496 "abort": true, 00:10:48.496 "seek_hole": false, 00:10:48.496 "seek_data": false, 00:10:48.496 "copy": true, 00:10:48.496 "nvme_iov_md": false 00:10:48.496 }, 00:10:48.496 "memory_domains": [ 00:10:48.496 { 00:10:48.496 "dma_device_id": "system", 00:10:48.496 "dma_device_type": 1 00:10:48.496 }, 00:10:48.496 { 
00:10:48.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:48.496 "dma_device_type": 2 00:10:48.496 } 00:10:48.496 ], 00:10:48.496 "driver_specific": {} 00:10:48.496 } 00:10:48.496 ] 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@907 -- # return 0 00:10:48.496 13:19:29 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:48.496 13:19:29 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # NOT wait 865747 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@650 -- # local es=0 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # valid_exec_arg wait 865747 00:10:48.496 13:19:29 blockdev_general.bdev_error -- bdev/blockdev.sh@513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@638 -- # local arg=wait 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # type -t wait 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:10:48.496 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@653 -- # wait 865747 00:10:48.496 Running I/O for 5 seconds... 
00:10:48.496 task offset: 70832 on job bdev=EE_Dev_1 fails 00:10:48.496 00:10:48.496 Latency(us) 00:10:48.496 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:10:48.496 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:48.496 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:10:48.496 EE_Dev_1 : 0.00 22088.35 86.28 5020.08 0.00 493.82 179.59 882.22 00:10:48.496 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:10:48.496 Dev_2 : 0.00 13822.89 54.00 0.00 0.00 863.61 171.72 1606.89 00:10:48.496 =================================================================================================================== 00:10:48.496 Total : 35911.25 140.28 5020.08 0.00 694.38 171.72 1606.89 00:10:48.496 [2024-07-25 13:19:29.223297] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:10:48.496 request: 00:10:48.496 { 00:10:48.496 "method": "perform_tests", 00:10:48.496 "req_id": 1 00:10:48.496 } 00:10:48.496 Got JSON-RPC error response 00:10:48.496 response: 00:10:48.497 { 00:10:48.497 "code": -32603, 00:10:48.497 "message": "bdevperf failed with error Operation not permitted" 00:10:48.497 } 00:10:48.758 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@653 -- # es=255 00:10:48.758 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:10:48.758 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@662 -- # es=127 00:10:48.758 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@663 -- # case "$es" in 00:10:48.758 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@670 -- # es=1 00:10:48.758 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:10:48.758 00:10:48.758 real 0m8.807s 00:10:48.758 user 0m9.132s 00:10:48.758 sys 0m0.761s 00:10:48.758 13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:48.758 
13:19:29 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:10:48.758 ************************************ 00:10:48.758 END TEST bdev_error 00:10:48.758 ************************************ 00:10:48.758 13:19:29 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_stat stat_test_suite '' 00:10:48.758 13:19:29 blockdev_general -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:48.758 13:19:29 blockdev_general -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:48.758 13:19:29 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:48.758 ************************************ 00:10:48.758 START TEST bdev_stat 00:10:48.758 ************************************ 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- common/autotest_common.sh@1125 -- # stat_test_suite '' 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- bdev/blockdev.sh@591 -- # STAT_DEV=Malloc_STAT 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # STAT_PID=866058 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # echo 'Process Bdev IO statistics testing pid: 866058' 00:10:48.758 Process Bdev IO statistics testing pid: 866058 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- bdev/blockdev.sh@594 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # waitforlisten 866058 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- common/autotest_common.sh@831 -- # '[' -z 866058 ']' 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- common/autotest_common.sh@836 -- # local max_retries=100 
00:10:48.758 13:19:29 blockdev_general.bdev_stat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:10:48.758 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:48.758 13:19:29 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:49.020 [2024-07-25 13:19:29.580299] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:10:49.020 [2024-07-25 13:19:29.580355] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid866058 ] 00:10:49.020 [2024-07-25 13:19:29.673308] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:10:49.021 [2024-07-25 13:19:29.770591] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:10:49.021 [2024-07-25 13:19:29.770596] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:49.963 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@864 -- # return 0 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- bdev/blockdev.sh@600 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:49.964 Malloc_STAT 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # waitforbdev Malloc_STAT 00:10:49.964 13:19:30 
blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local bdev_name=Malloc_STAT 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # local i 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:49.964 [ 00:10:49.964 { 00:10:49.964 "name": "Malloc_STAT", 00:10:49.964 "aliases": [ 00:10:49.964 "84496866-9afe-4216-a520-e9eafe874d62" 00:10:49.964 ], 00:10:49.964 "product_name": "Malloc disk", 00:10:49.964 "block_size": 512, 00:10:49.964 "num_blocks": 262144, 00:10:49.964 "uuid": "84496866-9afe-4216-a520-e9eafe874d62", 00:10:49.964 "assigned_rate_limits": { 00:10:49.964 "rw_ios_per_sec": 0, 00:10:49.964 "rw_mbytes_per_sec": 0, 00:10:49.964 "r_mbytes_per_sec": 0, 00:10:49.964 "w_mbytes_per_sec": 0 00:10:49.964 }, 00:10:49.964 "claimed": false, 00:10:49.964 "zoned": false, 00:10:49.964 "supported_io_types": { 00:10:49.964 "read": true, 00:10:49.964 "write": true, 00:10:49.964 "unmap": true, 00:10:49.964 "flush": true, 00:10:49.964 "reset": 
true, 00:10:49.964 "nvme_admin": false, 00:10:49.964 "nvme_io": false, 00:10:49.964 "nvme_io_md": false, 00:10:49.964 "write_zeroes": true, 00:10:49.964 "zcopy": true, 00:10:49.964 "get_zone_info": false, 00:10:49.964 "zone_management": false, 00:10:49.964 "zone_append": false, 00:10:49.964 "compare": false, 00:10:49.964 "compare_and_write": false, 00:10:49.964 "abort": true, 00:10:49.964 "seek_hole": false, 00:10:49.964 "seek_data": false, 00:10:49.964 "copy": true, 00:10:49.964 "nvme_iov_md": false 00:10:49.964 }, 00:10:49.964 "memory_domains": [ 00:10:49.964 { 00:10:49.964 "dma_device_id": "system", 00:10:49.964 "dma_device_type": 1 00:10:49.964 }, 00:10:49.964 { 00:10:49.964 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.964 "dma_device_type": 2 00:10:49.964 } 00:10:49.964 ], 00:10:49.964 "driver_specific": {} 00:10:49.964 } 00:10:49.964 ] 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- common/autotest_common.sh@907 -- # return 0 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # sleep 2 00:10:49.964 13:19:30 blockdev_general.bdev_stat -- bdev/blockdev.sh@603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:10:49.964 Running I/O for 10 seconds... 
00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # stat_function_test Malloc_STAT 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@558 -- # local bdev_name=Malloc_STAT 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local iostats 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local io_count1 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count2 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local iostats_per_channel 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local io_count_per_channel1 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel2 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel_all=0 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@567 -- # iostats='{ 00:10:51.880 "tick_rate": 2600000000, 00:10:51.880 "ticks": 14831205647294755, 00:10:51.880 "bdevs": [ 00:10:51.880 { 00:10:51.880 "name": "Malloc_STAT", 00:10:51.880 "bytes_read": 984658432, 00:10:51.880 "num_read_ops": 240388, 00:10:51.880 "bytes_written": 0, 00:10:51.880 "num_write_ops": 0, 00:10:51.880 "bytes_unmapped": 0, 00:10:51.880 "num_unmap_ops": 0, 00:10:51.880 "bytes_copied": 0, 00:10:51.880 "num_copy_ops": 0, 00:10:51.880 "read_latency_ticks": 2539459131704, 00:10:51.880 "max_read_latency_ticks": 15886982, 00:10:51.880 "min_read_latency_ticks": 316386, 
00:10:51.880 "write_latency_ticks": 0, 00:10:51.880 "max_write_latency_ticks": 0, 00:10:51.880 "min_write_latency_ticks": 0, 00:10:51.880 "unmap_latency_ticks": 0, 00:10:51.880 "max_unmap_latency_ticks": 0, 00:10:51.880 "min_unmap_latency_ticks": 0, 00:10:51.880 "copy_latency_ticks": 0, 00:10:51.880 "max_copy_latency_ticks": 0, 00:10:51.880 "min_copy_latency_ticks": 0, 00:10:51.880 "io_error": {} 00:10:51.880 } 00:10:51.880 ] 00:10:51.880 }' 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # jq -r '.bdevs[0].num_read_ops' 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # io_count1=240388 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@570 -- # iostats_per_channel='{ 00:10:51.880 "tick_rate": 2600000000, 00:10:51.880 "ticks": 14831205832412403, 00:10:51.880 "name": "Malloc_STAT", 00:10:51.880 "channels": [ 00:10:51.880 { 00:10:51.880 "thread_id": 2, 00:10:51.880 "bytes_read": 588251136, 00:10:51.880 "num_read_ops": 143616, 00:10:51.880 "bytes_written": 0, 00:10:51.880 "num_write_ops": 0, 00:10:51.880 "bytes_unmapped": 0, 00:10:51.880 "num_unmap_ops": 0, 00:10:51.880 "bytes_copied": 0, 00:10:51.880 "num_copy_ops": 0, 00:10:51.880 "read_latency_ticks": 1316682470104, 00:10:51.880 "max_read_latency_ticks": 9861462, 00:10:51.880 "min_read_latency_ticks": 7318424, 00:10:51.880 "write_latency_ticks": 0, 00:10:51.880 "max_write_latency_ticks": 0, 00:10:51.880 "min_write_latency_ticks": 0, 00:10:51.880 "unmap_latency_ticks": 0, 00:10:51.880 "max_unmap_latency_ticks": 0, 00:10:51.880 
"min_unmap_latency_ticks": 0, 00:10:51.880 "copy_latency_ticks": 0, 00:10:51.880 "max_copy_latency_ticks": 0, 00:10:51.880 "min_copy_latency_ticks": 0 00:10:51.880 }, 00:10:51.880 { 00:10:51.880 "thread_id": 3, 00:10:51.880 "bytes_read": 433061888, 00:10:51.880 "num_read_ops": 105728, 00:10:51.880 "bytes_written": 0, 00:10:51.880 "num_write_ops": 0, 00:10:51.880 "bytes_unmapped": 0, 00:10:51.880 "num_unmap_ops": 0, 00:10:51.880 "bytes_copied": 0, 00:10:51.880 "num_copy_ops": 0, 00:10:51.880 "read_latency_ticks": 1318100282124, 00:10:51.880 "max_read_latency_ticks": 15886982, 00:10:51.880 "min_read_latency_ticks": 9543520, 00:10:51.880 "write_latency_ticks": 0, 00:10:51.880 "max_write_latency_ticks": 0, 00:10:51.880 "min_write_latency_ticks": 0, 00:10:51.880 "unmap_latency_ticks": 0, 00:10:51.880 "max_unmap_latency_ticks": 0, 00:10:51.880 "min_unmap_latency_ticks": 0, 00:10:51.880 "copy_latency_ticks": 0, 00:10:51.880 "max_copy_latency_ticks": 0, 00:10:51.880 "min_copy_latency_ticks": 0 00:10:51.880 } 00:10:51.880 ] 00:10:51.880 }' 00:10:51.880 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # jq -r '.channels[0].num_read_ops' 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # io_count_per_channel1=143616 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel_all=143616 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # jq -r '.channels[1].num_read_ops' 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel2=105728 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel_all=249344 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- 
common/autotest_common.sh@10 -- # set +x 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@576 -- # iostats='{ 00:10:52.142 "tick_rate": 2600000000, 00:10:52.142 "ticks": 14831206091968193, 00:10:52.142 "bdevs": [ 00:10:52.142 { 00:10:52.142 "name": "Malloc_STAT", 00:10:52.142 "bytes_read": 1072738816, 00:10:52.142 "num_read_ops": 261892, 00:10:52.142 "bytes_written": 0, 00:10:52.142 "num_write_ops": 0, 00:10:52.142 "bytes_unmapped": 0, 00:10:52.142 "num_unmap_ops": 0, 00:10:52.142 "bytes_copied": 0, 00:10:52.142 "num_copy_ops": 0, 00:10:52.142 "read_latency_ticks": 2767712004116, 00:10:52.142 "max_read_latency_ticks": 15886982, 00:10:52.142 "min_read_latency_ticks": 316386, 00:10:52.142 "write_latency_ticks": 0, 00:10:52.142 "max_write_latency_ticks": 0, 00:10:52.142 "min_write_latency_ticks": 0, 00:10:52.142 "unmap_latency_ticks": 0, 00:10:52.142 "max_unmap_latency_ticks": 0, 00:10:52.142 "min_unmap_latency_ticks": 0, 00:10:52.142 "copy_latency_ticks": 0, 00:10:52.142 "max_copy_latency_ticks": 0, 00:10:52.142 "min_copy_latency_ticks": 0, 00:10:52.142 "io_error": {} 00:10:52.142 } 00:10:52.142 ] 00:10:52.142 }' 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # jq -r '.bdevs[0].num_read_ops' 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # io_count2=261892 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 249344 -lt 240388 ']' 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@582 -- # '[' 249344 -gt 261892 ']' 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@607 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@561 -- # xtrace_disable 00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:10:52.142 00:10:52.142 
Latency(us)
00:10:52.142 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:52.142 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096)
00:10:52.142 Malloc_STAT : 2.19 72436.68 282.96 0.00 0.00 3526.82 1052.36 3806.13
00:10:52.142 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096)
00:10:52.142 Malloc_STAT : 2.19 53333.31 208.33 0.00 0.00 4788.64 1272.91 6125.10
00:10:52.142 ===================================================================================================================
00:10:52.142 Total : 125769.99 491.29 0.00 0.00 4062.24 1052.36 6125.10
00:10:52.142 0
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # killprocess 866058
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@950 -- # '[' -z 866058 ']'
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # kill -0 866058
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # uname
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 866058
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 866058'
killing process with pid 866058
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@969 -- # kill 866058
Received shutdown signal, test time was about 2.277073 seconds
00:10:52.142
00:10:52.142 Latency(us)
00:10:52.142 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:10:52.142 ===================================================================================================================
00:10:52.142 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00
00:10:52.142 13:19:32 blockdev_general.bdev_stat -- common/autotest_common.sh@974 -- # wait 866058
00:10:52.403 13:19:33 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # trap - SIGINT SIGTERM EXIT
00:10:52.403
00:10:52.403 real 0m3.545s
00:10:52.403 user 0m7.169s
00:10:52.403 sys 0m0.395s
00:10:52.403 13:19:33 blockdev_general.bdev_stat -- common/autotest_common.sh@1126 -- # xtrace_disable
00:10:52.403 13:19:33 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x
00:10:52.403 ************************************
00:10:52.403 END TEST bdev_stat
00:10:52.403 ************************************
00:10:52.403 13:19:33 blockdev_general -- bdev/blockdev.sh@793 -- # [[ bdev == gpt ]]
00:10:52.403 13:19:33 blockdev_general -- bdev/blockdev.sh@797 -- # [[ bdev == crypto_sw ]]
00:10:52.403 13:19:33 blockdev_general -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT
00:10:52.403 13:19:33 blockdev_general -- bdev/blockdev.sh@810 -- # cleanup
00:10:52.403 13:19:33 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:10:52.403 13:19:33 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:10:52.403 13:19:33 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]]
00:10:52.403 13:19:33 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]]
00:10:52.403 13:19:33 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]]
00:10:52.403 13:19:33 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]]
00:10:52.403
00:10:52.403 real 1m51.247s
00:10:52.403 user 7m13.902s
00:10:52.403 sys 0m17.539s
00:10:52.403 13:19:33
blockdev_general -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:52.403 13:19:33 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:10:52.403 ************************************ 00:10:52.403 END TEST blockdev_general 00:10:52.403 ************************************ 00:10:52.403 13:19:33 -- spdk/autotest.sh@194 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:52.403 13:19:33 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:10:52.403 13:19:33 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:52.403 13:19:33 -- common/autotest_common.sh@10 -- # set +x 00:10:52.664 ************************************ 00:10:52.664 START TEST bdev_raid 00:10:52.664 ************************************ 00:10:52.664 13:19:33 bdev_raid -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:10:52.664 * Looking for test storage... 00:10:52.664 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:10:52.664 13:19:33 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:10:52.664 13:19:33 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:10:52.664 13:19:33 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:10:52.664 13:19:33 bdev_raid -- bdev/bdev_raid.sh@927 -- # mkdir -p /raidtest 00:10:52.664 13:19:33 bdev_raid -- bdev/bdev_raid.sh@928 -- # trap 'cleanup; exit 1' EXIT 00:10:52.664 13:19:33 bdev_raid -- bdev/bdev_raid.sh@930 -- # base_blocklen=512 00:10:52.664 13:19:33 bdev_raid -- bdev/bdev_raid.sh@932 -- # run_test raid0_resize_superblock_test raid_resize_superblock_test 0 00:10:52.664 13:19:33 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:52.664 13:19:33 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 
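The `run_test` calls traced above wrap each sub-test in the START/END banners and `real/user/sys` timing seen throughout this log. A minimal sketch of that pattern (a simplification for illustration, not SPDK's actual `autotest_common.sh` helper, which also manages xtrace and argument checks):

```shell
#!/usr/bin/env bash
# Sketch of the run_test wrapper pattern: print a START banner, run the
# test command, then print an END banner with elapsed time and exit code.
run_test() {
    local test_name=$1
    shift
    echo "************************************"
    echo "START TEST $test_name"
    echo "************************************"
    local start=$SECONDS
    "$@"
    local rc=$?
    echo "************************************"
    echo "END TEST $test_name (rc=$rc, $((SECONDS - start))s)"
    echo "************************************"
    return $rc
}
```

Used as in the log, e.g. `run_test bdev_raid ./test/bdev/bdev_raid.sh`, so every test's output is bracketed by its own banners.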
00:10:52.664 13:19:33 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:52.664 ************************************ 00:10:52.664 START TEST raid0_resize_superblock_test 00:10:52.664 ************************************ 00:10:52.664 13:19:33 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@1125 -- # raid_resize_superblock_test 0 00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@868 -- # local raid_level=0 00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@871 -- # raid_pid=866755 00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@870 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@872 -- # echo 'Process raid pid: 866755' 00:10:52.665 Process raid pid: 866755 00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@873 -- # waitforlisten 866755 /var/tmp/spdk-raid.sock 00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 866755 ']' 00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:52.665 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
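The `waitforlisten 866755 /var/tmp/spdk-raid.sock` step traced above blocks until the freshly started `bdev_svc` process is listening on its RPC Unix socket. A simplified sketch of that polling loop (the real helper also probes the socket through `rpc.py`; the retry count and sleep interval here are illustrative):

```shell
#!/usr/bin/env bash
# Sketch of the waitforlisten pattern: poll until the target pid has
# created its RPC Unix socket, giving up after max_retries attempts.
waitforlisten() {
    local pid=$1
    local rpc_addr=${2:-/var/tmp/spdk.sock}
    local max_retries=${3:-100}
    local i
    for ((i = 0; i < max_retries; i++)); do
        # Bail out early if the target process died before listening.
        kill -0 "$pid" 2>/dev/null || return 1
        # Ready once the RPC socket exists on the filesystem.
        [ -S "$rpc_addr" ] && return 0
        sleep 0.1
    done
    return 1
}
```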
00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:52.665 13:19:33 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:52.926 [2024-07-25 13:19:33.458252] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:10:52.926 [2024-07-25 13:19:33.458381] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:52.926 [2024-07-25 13:19:33.602668] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:52.926 [2024-07-25 13:19:33.695979] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:53.187 [2024-07-25 13:19:33.749580] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:53.187 [2024-07-25 13:19:33.749612] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:53.759 13:19:34 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:53.759 13:19:34 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:10:53.759 13:19:34 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@875 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create -b malloc0 512 512 00:10:54.020 malloc0 00:10:54.020 13:19:34 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@877 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:10:54.280 [2024-07-25 13:19:34.853923] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:10:54.280 [2024-07-25 13:19:34.853984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:54.280 [2024-07-25 
13:19:34.854003] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25563f0 00:10:54.280 [2024-07-25 13:19:34.854010] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:54.280 [2024-07-25 13:19:34.855577] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:54.280 [2024-07-25 13:19:34.855614] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:10:54.280 pt0 00:10:54.280 13:19:34 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@878 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create_lvstore pt0 lvs0 00:10:54.541 79b6b4a5-b7c6-4ff3-a507-c361e9f56d2a 00:10:54.541 13:19:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@880 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol0 64 00:10:54.802 dbb5168f-2800-4522-b608-91d6973f4688 00:10:54.802 13:19:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@881 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol1 64 00:10:54.802 bebe2f41-a57b-46c8-9668-06fb618a749e 00:10:54.802 13:19:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@883 -- # case $raid_level in 00:10:54.802 13:19:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@884 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -n Raid -r 0 -z 64 -b 'lvs0/lvol0 lvs0/lvol1' -s 00:10:55.063 [2024-07-25 13:19:35.751954] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev dbb5168f-2800-4522-b608-91d6973f4688 is claimed 00:10:55.063 [2024-07-25 13:19:35.752043] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev bebe2f41-a57b-46c8-9668-06fb618a749e is claimed 00:10:55.063 [2024-07-25 13:19:35.752152] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x25eac90 00:10:55.063 [2024-07-25 13:19:35.752159] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 245760, blocklen 512 00:10:55.063 [2024-07-25 13:19:35.752344] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25ea910 00:10:55.063 [2024-07-25 13:19:35.752483] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25eac90 00:10:55.063 [2024-07-25 13:19:35.752488] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x25eac90 00:10:55.063 [2024-07-25 13:19:35.752606] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:55.063 13:19:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 00:10:55.063 13:19:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # jq '.[].num_blocks' 00:10:55.324 13:19:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # (( 64 == 64 )) 00:10:55.324 13:19:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # jq '.[].num_blocks' 00:10:55.324 13:19:35 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:10:55.585 13:19:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # (( 64 == 64 )) 00:10:55.585 13:19:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:10:55.585 13:19:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:55.585 13:19:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 
00:10:55.585 13:19:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # jq '.[].num_blocks' 00:10:56.157 [2024-07-25 13:19:36.678457] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:56.157 13:19:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:10:56.157 13:19:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in 00:10:56.157 13:19:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@894 -- # (( 245760 == 245760 )) 00:10:56.157 13:19:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@899 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol0 100 00:10:56.157 [2024-07-25 13:19:36.886923] bdev_raid.c:2304:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:56.157 [2024-07-25 13:19:36.886937] bdev_raid.c:2317:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'dbb5168f-2800-4522-b608-91d6973f4688' was resized: old size 131072, new size 204800 00:10:56.157 13:19:36 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol1 100 00:10:56.417 [2024-07-25 13:19:37.079361] bdev_raid.c:2304:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:10:56.417 [2024-07-25 13:19:37.079375] bdev_raid.c:2317:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'bebe2f41-a57b-46c8-9668-06fb618a749e' was resized: old size 131072, new size 204800 00:10:56.417 [2024-07-25 13:19:37.079389] bdev_raid.c:2331:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 245760 to 393216 00:10:56.417 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0 
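The block counts in the resize notices above are self-consistent: each 131072-block (64 MiB) lvol joins the superblock-enabled raid as 122880 data blocks, i.e. its size minus 8192 blocks (4 MiB at 512 B blocks), and raid0 sums both members while raid1 exposes one copy. The 8192-block deduction is inferred from the numbers in this log, not a documented SPDK constant. A sketch of the arithmetic:

```shell
#!/usr/bin/env bash
# Check the raid block counts logged by the resize tests. Assumes each
# base bdev reserves 8192 blocks for the superblock/metadata region
# (inferred from this log: 131072-block lvols contribute 122880 blocks).
raid_data_blocks() {
    local raid_level=$1 num_base=$2 base_blocks=$3
    local sb_blocks=8192
    local per_bdev=$((base_blocks - sb_blocks))
    if [ "$raid_level" -eq 0 ]; then
        echo $((num_base * per_bdev))   # raid0 stripes across all bases
    else
        echo "$per_bdev"                # raid1 mirrors: one copy's worth
    fi
}
```

This reproduces the transitions in the log: raid0 goes 245760 → 393216 blocks when both lvols grow from 131072 to 204800 blocks, and raid1 goes 122880 → 196608.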
00:10:56.417 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # jq '.[].num_blocks' 00:10:56.677 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # (( 100 == 100 )) 00:10:56.678 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1 00:10:56.678 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # jq '.[].num_blocks' 00:10:56.678 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # (( 100 == 100 )) 00:10:56.678 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:10:56.678 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # jq '.[].num_blocks' 00:10:56.678 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:10:56.678 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:56.962 [2024-07-25 13:19:37.640891] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:56.962 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:10:56.962 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in 00:10:56.962 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@908 -- # (( 393216 == 393216 )) 00:10:56.962 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@912 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt0 00:10:57.256 [2024-07-25 13:19:37.817160] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev pt0 being removed: closing lvstore lvs0 00:10:57.256 
[2024-07-25 13:19:37.817204] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol0 00:10:57.256 [2024-07-25 13:19:37.817210] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:57.256 [2024-07-25 13:19:37.817217] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol1 00:10:57.256 [2024-07-25 13:19:37.817284] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:57.256 [2024-07-25 13:19:37.817308] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:57.256 [2024-07-25 13:19:37.817314] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25eac90 name Raid, state offline 00:10:57.256 13:19:37 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@913 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:10:57.256 [2024-07-25 13:19:38.005614] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:10:57.256 [2024-07-25 13:19:38.005643] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:57.256 [2024-07-25 13:19:38.005655] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f9690 00:10:57.256 [2024-07-25 13:19:38.005662] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:57.256 [2024-07-25 13:19:38.006922] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:57.256 [2024-07-25 13:19:38.006943] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:10:57.256 [2024-07-25 13:19:38.007928] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev dbb5168f-2800-4522-b608-91d6973f4688 00:10:57.256 [2024-07-25 13:19:38.007954] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev dbb5168f-2800-4522-b608-91d6973f4688 is 
claimed 00:10:57.256 [2024-07-25 13:19:38.008021] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev bebe2f41-a57b-46c8-9668-06fb618a749e 00:10:57.256 [2024-07-25 13:19:38.008033] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev bebe2f41-a57b-46c8-9668-06fb618a749e is claimed 00:10:57.256 [2024-07-25 13:19:38.008115] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev bebe2f41-a57b-46c8-9668-06fb618a749e (2) smaller than existing raid bdev Raid (3) 00:10:57.256 [2024-07-25 13:19:38.008138] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x26ee510 00:10:57.256 [2024-07-25 13:19:38.008142] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 393216, blocklen 512 00:10:57.256 [2024-07-25 13:19:38.008277] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25ec160 00:10:57.256 [2024-07-25 13:19:38.008387] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26ee510 00:10:57.256 [2024-07-25 13:19:38.008393] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x26ee510 00:10:57.256 [2024-07-25 13:19:38.008476] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:57.256 pt0 00:10:57.256 13:19:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:10:57.256 13:19:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # jq '.[].num_blocks' 00:10:57.256 13:19:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:10:57.256 13:19:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:10:57.522 [2024-07-25 13:19:38.198306] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:57.522 13:19:38 
bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@918 -- # (( 393216 == 393216 )) 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@922 -- # killprocess 866755 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 866755 ']' 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@954 -- # kill -0 866755 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@955 -- # uname 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 866755 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 866755' 00:10:57.522 killing process with pid 866755 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@969 -- # kill 866755 00:10:57.522 [2024-07-25 13:19:38.258236] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:57.522 [2024-07-25 13:19:38.258277] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:57.522 [2024-07-25 13:19:38.258302] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:57.522 [2024-07-25 13:19:38.258307] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26ee510 name Raid, state offline 00:10:57.522 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@974 -- # wait 866755 00:10:57.522 [2024-07-25 13:19:38.303929] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:57.783 13:19:38 bdev_raid.raid0_resize_superblock_test -- bdev/bdev_raid.sh@924 -- # return 0 00:10:57.783 00:10:57.783 real 0m5.062s 00:10:57.783 user 0m8.449s 00:10:57.783 sys 0m0.949s 00:10:57.783 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:10:57.783 13:19:38 bdev_raid.raid0_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:57.783 ************************************ 00:10:57.783 END TEST raid0_resize_superblock_test 00:10:57.783 ************************************ 00:10:57.783 13:19:38 bdev_raid -- bdev/bdev_raid.sh@933 -- # run_test raid1_resize_superblock_test raid_resize_superblock_test 1 00:10:57.783 13:19:38 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:10:57.783 13:19:38 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:10:57.783 13:19:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:57.783 ************************************ 00:10:57.783 START TEST raid1_resize_superblock_test 00:10:57.783 ************************************ 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@1125 -- # raid_resize_superblock_test 1 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@868 -- # local raid_level=1 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@871 -- # raid_pid=867726 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@872 -- # echo 'Process raid pid: 867726' 00:10:57.783 Process raid pid: 867726 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@873 -- # 
waitforlisten 867726 /var/tmp/spdk-raid.sock 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@870 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 867726 ']' 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:57.783 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:10:57.783 13:19:38 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:57.783 [2024-07-25 13:19:38.549371] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:10:57.783 [2024-07-25 13:19:38.549416] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:58.043 [2024-07-25 13:19:38.638903] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:58.043 [2024-07-25 13:19:38.701464] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:10:58.043 [2024-07-25 13:19:38.742260] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:58.043 [2024-07-25 13:19:38.742282] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:58.613 13:19:39 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:10:58.613 13:19:39 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:10:58.613 13:19:39 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@875 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create -b malloc0 512 512 00:10:58.873 malloc0 00:10:59.133 13:19:39 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@877 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0 00:10:59.133 [2024-07-25 13:19:39.846030] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:10:59.133 [2024-07-25 13:19:39.846064] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:59.133 [2024-07-25 13:19:39.846079] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x271e3f0 00:10:59.133 [2024-07-25 13:19:39.846086] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:59.133 [2024-07-25 13:19:39.847391] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:10:59.133 [2024-07-25 13:19:39.847410] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0
00:10:59.133 pt0
00:10:59.133 13:19:39 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@878 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create_lvstore pt0 lvs0
00:10:59.392 08b94616-aa06-4739-a64a-7e37471dc53e
00:10:59.392 13:19:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@880 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol0 64
00:10:59.651 9d8f3cbd-285d-447a-b382-1d2a6b6b7f03
00:10:59.651 13:19:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@881 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_create -l lvs0 lvol1 64
00:10:59.911 f16f290a-c7f7-40b9-bb95-a91c02df7189
00:10:59.911 13:19:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@883 -- # case $raid_level in
00:10:59.911 13:19:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@885 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -n Raid -r 1 -b 'lvs0/lvol0 lvs0/lvol1' -s
00:10:59.911 [2024-07-25 13:19:40.666552] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev 9d8f3cbd-285d-447a-b382-1d2a6b6b7f03 is claimed
00:10:59.911 [2024-07-25 13:19:40.666617] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev f16f290a-c7f7-40b9-bb95-a91c02df7189 is claimed
00:10:59.911 [2024-07-25 13:19:40.666717] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x27b2c90
00:10:59.911 [2024-07-25 13:19:40.666724] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 122880, blocklen 512
00:10:59.911 [2024-07-25 13:19:40.666879] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28b7900
00:10:59.911 [2024-07-25 13:19:40.667005] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27b2c90
00:10:59.911 [2024-07-25 13:19:40.667011] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x27b2c90
00:10:59.911 [2024-07-25 13:19:40.667101] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:10:59.911 13:19:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # jq '.[].num_blocks'
00:10:59.911 13:19:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0
00:11:00.171 13:19:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@889 -- # (( 64 == 64 ))
00:11:00.171 13:19:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1
00:11:00.171 13:19:40 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # jq '.[].num_blocks'
00:11:00.430 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@890 -- # (( 64 == 64 ))
00:11:00.430 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in
00:11:00.430 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # jq '.[].num_blocks'
00:11:00.430 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in
00:11:00.430 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid
00:11:00.691 [2024-07-25 13:19:41.244164] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:11:00.691 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in
00:11:00.691 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@893 -- # case $raid_level in
00:11:00.691 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@895 -- # (( 122880 == 122880 ))
00:11:00.691 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@899 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol0 100
00:11:00.691 [2024-07-25 13:19:41.420553] bdev_raid.c:2304:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev
00:11:00.691 [2024-07-25 13:19:41.420566] bdev_raid.c:2317:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev '9d8f3cbd-285d-447a-b382-1d2a6b6b7f03' was resized: old size 131072, new size 204800
00:11:00.691 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@900 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_lvol_resize lvs0/lvol1 100
00:11:00.951 [2024-07-25 13:19:41.608995] bdev_raid.c:2304:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev
00:11:00.951 [2024-07-25 13:19:41.609011] bdev_raid.c:2317:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'f16f290a-c7f7-40b9-bb95-a91c02df7189' was resized: old size 131072, new size 204800
00:11:00.951 [2024-07-25 13:19:41.609026] bdev_raid.c:2331:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 122880 to 196608
00:11:00.951 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol0
00:11:00.951 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # jq '.[].num_blocks'
00:11:01.212 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@903 -- # (( 100 == 100 ))
00:11:01.212 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b lvs0/lvol1
00:11:01.212 13:19:41 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # jq '.[].num_blocks'
00:11:01.472 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@904 -- # (( 100 == 100 ))
00:11:01.472 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in
00:11:01.472 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid
00:11:01.472 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in
00:11:01.472 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # jq '.[].num_blocks'
00:11:01.472 [2024-07-25 13:19:42.186555] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:11:01.472 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in
00:11:01.472 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@907 -- # case $raid_level in
00:11:01.472 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@909 -- # (( 196608 == 196608 ))
00:11:01.472 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@912 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt0
00:11:01.733 [2024-07-25 13:19:42.374846] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev pt0 being removed: closing lvstore lvs0
00:11:01.733 [2024-07-25 13:19:42.374891] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol0
00:11:01.733 [2024-07-25 13:19:42.374907] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: lvs0/lvol1
00:11:01.733 [2024-07-25 13:19:42.374997] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:11:01.733 [2024-07-25 13:19:42.375106] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:11:01.733 [2024-07-25 13:19:42.375152] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:11:01.733 [2024-07-25 13:19:42.375159] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27b2c90 name Raid, state offline
00:11:01.733 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@913 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc0 -p pt0
00:11:01.994 [2024-07-25 13:19:42.563298] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0
00:11:01.994 [2024-07-25 13:19:42.563327] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:11:01.994 [2024-07-25 13:19:42.563340] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28b5e30
00:11:01.994 [2024-07-25 13:19:42.563346] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:11:01.994 [2024-07-25 13:19:42.564617] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:11:01.994 [2024-07-25 13:19:42.564637] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0
00:11:01.994 [2024-07-25 13:19:42.565632] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 9d8f3cbd-285d-447a-b382-1d2a6b6b7f03
00:11:01.994 [2024-07-25 13:19:42.565660] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev 9d8f3cbd-285d-447a-b382-1d2a6b6b7f03 is claimed
00:11:01.994 [2024-07-25 13:19:42.565726] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev f16f290a-c7f7-40b9-bb95-a91c02df7189
00:11:01.994 [2024-07-25 13:19:42.565737] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev f16f290a-c7f7-40b9-bb95-a91c02df7189 is claimed
00:11:01.994 [2024-07-25 13:19:42.565823] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev f16f290a-c7f7-40b9-bb95-a91c02df7189 (2) smaller than existing raid bdev Raid (3)
00:11:01.994 [2024-07-25 13:19:42.565845] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x28b6510
00:11:01.994 [2024-07-25 13:19:42.565849] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512
00:11:01.994 [2024-07-25 13:19:42.565981] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28b6d20
00:11:01.994 [2024-07-25 13:19:42.566095] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28b6510
00:11:01.994 [2024-07-25 13:19:42.566101] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x28b6510
00:11:01.994 [2024-07-25 13:19:42.566184] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:11:01.994 pt0
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # jq '.[].num_blocks'
00:11:01.994 [2024-07-25 13:19:42.739946] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@917 -- # case $raid_level in
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@919 -- # (( 196608 == 196608 ))
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@922 -- # killprocess 867726
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 867726 ']'
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@954 -- # kill -0 867726
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@955 -- # uname
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:11:01.994 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 867726
00:11:02.254 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:11:02.254 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:11:02.254 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 867726'
00:11:02.254 killing process with pid 867726
00:11:02.254 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@969 -- # kill 867726
00:11:02.254 [2024-07-25 13:19:42.828766] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:11:02.254 [2024-07-25 13:19:42.828799] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:11:02.254 [2024-07-25 13:19:42.828829] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:11:02.254 [2024-07-25 13:19:42.828834] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28b6510 name Raid, state offline
00:11:02.254 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@974 -- # wait 867726
00:11:02.254 [2024-07-25 13:19:42.874464] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:11:02.254 13:19:42 bdev_raid.raid1_resize_superblock_test -- bdev/bdev_raid.sh@924 -- # return 0
00:11:02.254
00:11:02.254 real 0m4.497s
00:11:02.254 user 0m7.462s
00:11:02.254 sys 0m0.800s
00:11:02.254 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable
00:11:02.254 13:19:42 bdev_raid.raid1_resize_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:11:02.254 ************************************
00:11:02.254 END TEST raid1_resize_superblock_test
00:11:02.254 ************************************
00:11:02.254 13:19:43 bdev_raid -- bdev/bdev_raid.sh@935 -- # uname -s
00:11:02.254 13:19:43 bdev_raid -- bdev/bdev_raid.sh@935 -- # '[' Linux = Linux ']'
00:11:02.254 13:19:43 bdev_raid -- bdev/bdev_raid.sh@935 -- # modprobe -n nbd
00:11:02.254 13:19:43 bdev_raid -- bdev/bdev_raid.sh@936 -- # has_nbd=true
00:11:02.254 13:19:43 bdev_raid -- bdev/bdev_raid.sh@937 -- # modprobe nbd
00:11:02.515 13:19:43 bdev_raid -- bdev/bdev_raid.sh@938 -- # run_test raid_function_test_raid0 raid_function_test raid0
00:11:02.515 13:19:43 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']'
00:11:02.515 13:19:43 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable
00:11:02.515 13:19:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:11:02.515 ************************************
00:11:02.515 START TEST raid_function_test_raid0
00:11:02.515 ************************************
00:11:02.515 13:19:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1125 -- # raid_function_test raid0
00:11:02.515 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0
00:11:02.515 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0
00:11:02.515 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev
00:11:02.515 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=868420
13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 868420'
00:11:02.516 Process raid pid: 868420
00:11:02.516 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 868420 /var/tmp/spdk-raid.sock
00:11:02.516 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:11:02.516 13:19:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@831 -- # '[' -z 868420 ']'
00:11:02.516 13:19:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:11:02.516 13:19:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@836 -- # local max_retries=100
00:11:02.516 13:19:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:11:02.516 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:11:02.516 13:19:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@840 -- # xtrace_disable
00:11:02.516 13:19:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x
00:11:02.516 [2024-07-25 13:19:43.146964] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:11:02.516 [2024-07-25 13:19:43.147010] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:11:02.516 [2024-07-25 13:19:43.238988] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:11:02.784 [2024-07-25 13:19:43.314723] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:11:02.784 [2024-07-25 13:19:43.364424] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:11:02.784 [2024-07-25 13:19:43.364450] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:11:02.784 13:19:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:11:02.784 13:19:43 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@864 -- # return 0
00:11:02.784 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0
00:11:02.784 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0
00:11:02.784 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt
00:11:02.784 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 -- # cat
00:11:02.784 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock
00:11:03.045 [2024-07-25 13:19:43.680122] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed
00:11:03.045 [2024-07-25 13:19:43.681110] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed
00:11:03.045 [2024-07-25 13:19:43.681152] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x10ea820
00:11:03.045 [2024-07-25 13:19:43.681158] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512
00:11:03.045 [2024-07-25 13:19:43.681434] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x129eb30
00:11:03.045 [2024-07-25 13:19:43.681520] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10ea820
00:11:03.045 [2024-07-25 13:19:43.681525] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x10ea820
00:11:03.045 [2024-07-25 13:19:43.681607] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:11:03.045 Base_1
00:11:03.045 Base_2
00:11:03.045 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt
00:11:03.045 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online
00:11:03.045 13:19:43 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)'
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']'
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid')
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0')
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:11:03.615 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0
00:11:03.875 [2024-07-25 13:19:44.494198] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x129eb30
00:11:03.875 /dev/nbd0
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # local nbd_name=nbd0
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@869 -- # local i
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@873 -- # break
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:11:03.875 1+0 records in
00:11:03.875 1+0 records out
00:11:03.875 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265829 s, 15.4 MB/s
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # size=4096
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@889 -- # return 0
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 ))
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:11:03.875 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:11:04.136 {
00:11:04.136 "nbd_device": "/dev/nbd0",
00:11:04.136 "bdev_name": "raid"
00:11:04.136 }
00:11:04.136 ]'
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[
00:11:04.136 {
00:11:04.136 "nbd_device": "/dev/nbd0",
00:11:04.136 "bdev_name": "raid"
00:11:04.136 }
00:11:04.136 ]'
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']'
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321')
00:11:04.136 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs
00:11:04.137 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456')
00:11:04.137 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums
00:11:04.137 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off
00:11:04.137 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len
00:11:04.137 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096
00:11:04.137 4096+0 records in
00:11:04.137 4096+0 records out
00:11:04.137 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0276489 s, 75.8 MB/s
00:11:04.137 13:19:44 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct
00:11:04.398 4096+0 records in
00:11:04.398 4096+0 records out
00:11:04.398 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.169106 s, 12.4 MB/s
00:11:04.398 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0
00:11:04.398 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
00:11:04.398 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 ))
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 ))
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc
00:11:04.399 128+0 records in
00:11:04.399 128+0 records out
00:11:04.399 65536 bytes (66 kB, 64 KiB) copied, 0.000362415 s, 181 MB/s
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ ))
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 ))
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc
00:11:04.399 2035+0 records in
00:11:04.399 2035+0 records out
00:11:04.399 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00484021 s, 215 MB/s
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ ))
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 ))
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc
00:11:04.399 456+0 records in
00:11:04.399 456+0 records out
00:11:04.399 233472 bytes (233 kB, 228 KiB) copied, 0.00119622 s, 195 MB/s
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ ))
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 ))
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:11:04.399 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0
00:11:04.660 [2024-07-25 13:19:45.299342] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:11:04.660 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]'
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo ''
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']'
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 868420
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@950 -- # '[' -z 868420 ']'
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # kill -0 868420
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # uname
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 868420
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@968 -- # echo 'killing process with pid 868420'
00:11:04.921 killing process with pid 868420
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@969 -- # kill 868420
00:11:04.921 [2024-07-25 13:19:45.618964] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:11:04.921 [2024-07-25 13:19:45.619016] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:11:04.921 [2024-07-25 13:19:45.619047] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:11:04.921 [2024-07-25 13:19:45.619056] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10ea820 name raid, state offline
00:11:04.921 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@974 -- # wait 868420
00:11:04.921 [2024-07-25 13:19:45.628374] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:11:05.183 13:19:45 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0
00:11:05.183
00:11:05.183 real 0m2.656s
00:11:05.183 user 0m4.135s
00:11:05.183 sys 0m0.805s
00:11:05.183 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1126 -- # xtrace_disable
00:11:05.183 13:19:45 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x
00:11:05.183 ************************************
00:11:05.183 END TEST raid_function_test_raid0
00:11:05.183 ************************************
00:11:05.183 13:19:45 bdev_raid -- bdev/bdev_raid.sh@939 -- # run_test raid_function_test_concat raid_function_test concat
00:11:05.183 13:19:45 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']'
00:11:05.183 13:19:45 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable
00:11:05.183 13:19:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:11:05.183 ************************************
00:11:05.183 START TEST raid_function_test_concat
00:11:05.183 ************************************
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1125 -- # raid_function_test concat
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=869092
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 869092'
Process raid pid: 869092
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 869092 /var/tmp/spdk-raid.sock
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@831 -- # '[' -z 869092 ']'
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@836 -- # local max_retries=100
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:11:05.183 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@840 -- # xtrace_disable
00:11:05.183 13:19:45 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x
00:11:05.183 [2024-07-25 13:19:45.920125] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:11:05.183 [2024-07-25 13:19:45.920254] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:05.443 [2024-07-25 13:19:46.066230] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:05.443 [2024-07-25 13:19:46.138778] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:05.443 [2024-07-25 13:19:46.187553] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:05.443 [2024-07-25 13:19:46.187577] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:06.013 13:19:46 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:06.013 13:19:46 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@864 -- # return 0 00:11:06.013 13:19:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:11:06.013 13:19:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:11:06.013 13:19:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:06.013 13:19:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:11:06.013 13:19:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:11:06.274 [2024-07-25 13:19:46.919976] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:06.274 [2024-07-25 13:19:46.920949] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:06.274 [2024-07-25 13:19:46.920989] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x16fa820 00:11:06.274 [2024-07-25 13:19:46.920994] 
bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:06.274 [2024-07-25 13:19:46.921177] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16fe3e0 00:11:06.274 [2024-07-25 13:19:46.921261] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16fa820 00:11:06.274 [2024-07-25 13:19:46.921267] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x16fa820 00:11:06.274 [2024-07-25 13:19:46.921339] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:06.274 Base_1 00:11:06.274 Base_2 00:11:06.274 13:19:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:11:06.274 13:19:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:06.274 13:19:46 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:06.845 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:11:07.106 [2024-07-25 13:19:47.665864] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16fe310 00:11:07.106 /dev/nbd0 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@869 -- # local i 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@873 -- # break 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:11:07.106 1+0 records in 
00:11:07.106 1+0 records out 00:11:07.106 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000253603 s, 16.2 MB/s 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # size=4096 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@889 -- # return 0 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:07.106 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:11:07.367 { 00:11:07.367 "nbd_device": "/dev/nbd0", 00:11:07.367 "bdev_name": "raid" 00:11:07.367 } 00:11:07.367 ]' 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:11:07.367 { 00:11:07.367 "nbd_device": "/dev/nbd0", 00:11:07.367 "bdev_name": "raid" 00:11:07.367 } 00:11:07.367 ]' 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:07.367 13:19:47 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:11:07.367 13:19:47 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:11:07.367 4096+0 records in 00:11:07.367 4096+0 records out 00:11:07.367 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0264362 s, 79.3 MB/s 00:11:07.367 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:11:07.628 4096+0 records in 00:11:07.628 4096+0 records out 00:11:07.628 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.174198 s, 12.0 MB/s 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 
00:11:07.628 128+0 records in 00:11:07.628 128+0 records out 00:11:07.628 65536 bytes (66 kB, 64 KiB) copied, 0.000362422 s, 181 MB/s 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:11:07.628 2035+0 records in 00:11:07.628 2035+0 records out 00:11:07.628 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00479729 s, 217 MB/s 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:11:07.628 456+0 records in 00:11:07.628 456+0 records out 00:11:07.628 233472 bytes (233 kB, 228 KiB) copied, 0.0011275 s, 207 MB/s 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:11:07.628 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:11:07.889 [2024-07-25 13:19:48.467161] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:11:07.889 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:11:07.889 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:11:07.889 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:11:07.889 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:11:07.889 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:11:07.889 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:11:07.890 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:11:07.890 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:11:07.890 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:11:07.890 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:11:07.890 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:11:07.890 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:11:07.890 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:11:07.890 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # 
true 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 869092 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@950 -- # '[' -z 869092 ']' 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # kill -0 869092 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # uname 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 869092 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 869092' 00:11:08.151 killing process with pid 869092 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@969 -- # kill 869092 00:11:08.151 [2024-07-25 13:19:48.803252] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:08.151 [2024-07-25 13:19:48.803300] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:08.151 [2024-07-25 13:19:48.803331] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:08.151 
[2024-07-25 13:19:48.803337] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16fa820 name raid, state offline 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@974 -- # wait 869092 00:11:08.151 [2024-07-25 13:19:48.812673] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:11:08.151 00:11:08.151 real 0m3.109s 00:11:08.151 user 0m4.494s 00:11:08.151 sys 0m0.872s 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:08.151 13:19:48 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:11:08.151 ************************************ 00:11:08.151 END TEST raid_function_test_concat 00:11:08.151 ************************************ 00:11:08.412 13:19:48 bdev_raid -- bdev/bdev_raid.sh@942 -- # run_test raid0_resize_test raid_resize_test 0 00:11:08.412 13:19:48 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:08.412 13:19:48 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:08.412 13:19:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:08.412 ************************************ 00:11:08.412 START TEST raid0_resize_test 00:11:08.412 ************************************ 00:11:08.412 13:19:49 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1125 -- # raid_resize_test 0 00:11:08.412 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local raid_level=0 00:11:08.412 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:11:08.412 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:11:08.412 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:11:08.412 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local blkcnt 
00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # local expected_size 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # raid_pid=869664 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@358 -- # echo 'Process raid pid: 869664' 00:11:08.413 Process raid pid: 869664 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # waitforlisten 869664 /var/tmp/spdk-raid.sock 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- common/autotest_common.sh@831 -- # '[' -z 869664 ']' 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:08.413 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:08.413 13:19:49 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.413 [2024-07-25 13:19:49.103391] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:11:08.413 [2024-07-25 13:19:49.103526] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:08.673 [2024-07-25 13:19:49.247509] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:08.673 [2024-07-25 13:19:49.325095] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:08.673 [2024-07-25 13:19:49.371787] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:08.673 [2024-07-25 13:19:49.371808] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:09.615 13:19:50 bdev_raid.raid0_resize_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:09.615 13:19:50 bdev_raid.raid0_resize_test -- common/autotest_common.sh@864 -- # return 0 00:11:09.615 13:19:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@361 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:11:09.875 Base_1 00:11:09.875 13:19:50 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:11:10.445 Base_2 00:11:10.445 13:19:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@364 -- # '[' 0 -eq 0 ']' 00:11:10.445 13:19:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:11:10.705 [2024-07-25 13:19:51.298229] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:10.706 [2024-07-25 13:19:51.299342] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:10.706 [2024-07-25 13:19:51.299378] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xfb99d0 00:11:10.706 [2024-07-25 13:19:51.299383] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:10.706 [2024-07-25 13:19:51.299543] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1152a00 00:11:10.706 [2024-07-25 13:19:51.299620] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xfb99d0 00:11:10.706 [2024-07-25 13:19:51.299625] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0xfb99d0 00:11:10.706 [2024-07-25 13:19:51.299703] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:10.706 13:19:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@371 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:11:11.276 [2024-07-25 13:19:51.823521] bdev_raid.c:2304:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:11.276 [2024-07-25 13:19:51.823534] bdev_raid.c:2317:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:11:11.276 true 00:11:11.276 13:19:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:11.276 13:19:51 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # jq '.[].num_blocks' 00:11:11.846 [2024-07-25 13:19:52.365022] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:11.846 13:19:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@374 -- # blkcnt=131072 00:11:11.846 13:19:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@375 -- # raid_size_mb=64 00:11:11.846 13:19:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # '[' 0 -eq 0 ']' 00:11:11.846 13:19:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@377 -- # 
expected_size=64 00:11:11.846 13:19:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 64 '!=' 64 ']' 00:11:11.846 13:19:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:11:12.106 [2024-07-25 13:19:52.645592] bdev_raid.c:2304:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:12.106 [2024-07-25 13:19:52.645606] bdev_raid.c:2317:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:11:12.106 [2024-07-25 13:19:52.645625] bdev_raid.c:2331:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:11:12.106 true 00:11:12.106 13:19:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:12.106 13:19:52 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # jq '.[].num_blocks' 00:11:12.676 [2024-07-25 13:19:53.175051] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@390 -- # blkcnt=262144 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@391 -- # raid_size_mb=128 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@392 -- # '[' 0 -eq 0 ']' 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@393 -- # expected_size=128 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@397 -- # '[' 128 '!=' 128 ']' 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@402 -- # killprocess 869664 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@950 -- # '[' -z 869664 ']' 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # kill -0 869664 00:11:12.676 
13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # uname 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 869664 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 869664' 00:11:12.676 killing process with pid 869664 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@969 -- # kill 869664 00:11:12.676 [2024-07-25 13:19:53.261330] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:12.676 [2024-07-25 13:19:53.261371] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:12.676 [2024-07-25 13:19:53.261401] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:12.676 [2024-07-25 13:19:53.261407] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xfb99d0 name Raid, state offline 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@974 -- # wait 869664 00:11:12.676 [2024-07-25 13:19:53.262310] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@404 -- # return 0 00:11:12.676 00:11:12.676 real 0m4.368s 00:11:12.676 user 0m7.467s 00:11:12.676 sys 0m0.654s 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:12.676 13:19:53 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:12.676 ************************************ 00:11:12.676 END TEST raid0_resize_test 
00:11:12.676 ************************************ 00:11:12.676 13:19:53 bdev_raid -- bdev/bdev_raid.sh@943 -- # run_test raid1_resize_test raid_resize_test 1 00:11:12.676 13:19:53 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:11:12.676 13:19:53 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:12.676 13:19:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:12.676 ************************************ 00:11:12.676 START TEST raid1_resize_test 00:11:12.676 ************************************ 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- common/autotest_common.sh@1125 -- # raid_resize_test 1 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@347 -- # local raid_level=1 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@348 -- # local blksize=512 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@349 -- # local bdev_size_mb=32 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@350 -- # local new_bdev_size_mb=64 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@351 -- # local blkcnt 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@352 -- # local raid_size_mb 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@353 -- # local new_raid_size_mb 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@354 -- # local expected_size 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@357 -- # raid_pid=870444 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@358 -- # echo 'Process raid pid: 870444' 00:11:12.676 Process raid pid: 870444 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@359 -- # waitforlisten 870444 /var/tmp/spdk-raid.sock 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@356 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- common/autotest_common.sh@831 -- # '[' -z 870444 ']' 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:12.676 13:19:53 bdev_raid.raid1_resize_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:12.677 13:19:53 bdev_raid.raid1_resize_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:12.677 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:12.677 13:19:53 bdev_raid.raid1_resize_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:12.677 13:19:53 bdev_raid.raid1_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:12.936 [2024-07-25 13:19:53.501396] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:11:12.936 [2024-07-25 13:19:53.501447] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:12.936 [2024-07-25 13:19:53.592770] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:12.936 [2024-07-25 13:19:53.658808] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:12.936 [2024-07-25 13:19:53.698470] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:12.936 [2024-07-25 13:19:53.698492] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:14.316 13:19:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:14.316 13:19:54 bdev_raid.raid1_resize_test -- common/autotest_common.sh@864 -- # return 0 00:11:14.316 13:19:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@361 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:11:14.316 Base_1 00:11:14.316 13:19:54 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:11:14.886 Base_2 00:11:14.886 13:19:55 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@364 -- # '[' 1 -eq 0 ']' 00:11:14.886 13:19:55 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@367 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r 1 -b 'Base_1 Base_2' -n Raid 00:11:15.146 [2024-07-25 13:19:55.736666] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:11:15.146 [2024-07-25 13:19:55.737788] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:11:15.146 [2024-07-25 13:19:55.737826] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x25f49d0 00:11:15.146 [2024-07-25 13:19:55.737831] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:11:15.146 [2024-07-25 13:19:55.737997] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x278db80 00:11:15.146 [2024-07-25 13:19:55.738067] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25f49d0 00:11:15.146 [2024-07-25 13:19:55.738072] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x25f49d0 00:11:15.146 [2024-07-25 13:19:55.738149] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:15.146 13:19:55 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@371 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:11:15.791 [2024-07-25 13:19:56.265978] bdev_raid.c:2304:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 
00:11:15.791 [2024-07-25 13:19:56.265997] bdev_raid.c:2317:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:11:15.791 true 00:11:15.791 13:19:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:15.791 13:19:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # jq '.[].num_blocks' 00:11:15.791 [2024-07-25 13:19:56.474632] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:15.791 13:19:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@374 -- # blkcnt=65536 00:11:15.791 13:19:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@375 -- # raid_size_mb=32 00:11:15.791 13:19:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@376 -- # '[' 1 -eq 0 ']' 00:11:15.791 13:19:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@379 -- # expected_size=32 00:11:15.791 13:19:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 32 '!=' 32 ']' 00:11:15.791 13:19:56 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:11:16.372 [2024-07-25 13:19:57.003816] bdev_raid.c:2304:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:11:16.372 [2024-07-25 13:19:57.003829] bdev_raid.c:2317:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:11:16.372 [2024-07-25 13:19:57.003846] bdev_raid.c:2331:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 65536 to 131072 00:11:16.372 true 00:11:16.372 13:19:57 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:11:16.372 13:19:57 bdev_raid.raid1_resize_test -- 
bdev/bdev_raid.sh@390 -- # jq '.[].num_blocks' 00:11:16.942 [2024-07-25 13:19:57.549312] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@390 -- # blkcnt=131072 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@391 -- # raid_size_mb=64 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@392 -- # '[' 1 -eq 0 ']' 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@395 -- # expected_size=64 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@397 -- # '[' 64 '!=' 64 ']' 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@402 -- # killprocess 870444 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@950 -- # '[' -z 870444 ']' 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@954 -- # kill -0 870444 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@955 -- # uname 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 870444 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 870444' 00:11:16.942 killing process with pid 870444 00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@969 -- # kill 870444 00:11:16.942 [2024-07-25 13:19:57.637218] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:16.942 [2024-07-25 13:19:57.637258] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
00:11:16.942 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@974 -- # wait 870444 00:11:16.942 [2024-07-25 13:19:57.637533] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:16.942 [2024-07-25 13:19:57.637540] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25f49d0 name Raid, state offline 00:11:16.942 [2024-07-25 13:19:57.638187] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:17.203 13:19:57 bdev_raid.raid1_resize_test -- bdev/bdev_raid.sh@404 -- # return 0 00:11:17.203 00:11:17.203 real 0m4.301s 00:11:17.203 user 0m7.359s 00:11:17.203 sys 0m0.604s 00:11:17.203 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:17.203 13:19:57 bdev_raid.raid1_resize_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.203 ************************************ 00:11:17.203 END TEST raid1_resize_test 00:11:17.203 ************************************ 00:11:17.203 13:19:57 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4} 00:11:17.203 13:19:57 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:11:17.203 13:19:57 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:11:17.203 13:19:57 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:17.203 13:19:57 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:17.203 13:19:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:17.203 ************************************ 00:11:17.203 START TEST raid_state_function_test 00:11:17.203 ************************************ 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 2 false 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:17.203 13:19:57 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=871123 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 871123' 00:11:17.203 Process raid pid: 871123 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 871123 /var/tmp/spdk-raid.sock 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 871123 ']' 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:17.203 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:17.203 13:19:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:17.203 [2024-07-25 13:19:57.879289] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:11:17.203 [2024-07-25 13:19:57.879359] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:17.203 [2024-07-25 13:19:57.983044] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:17.463 [2024-07-25 13:19:58.051579] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:17.463 [2024-07-25 13:19:58.105873] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:17.463 [2024-07-25 13:19:58.105897] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:18.033 13:19:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:18.033 13:19:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:11:18.033 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:18.292 [2024-07-25 13:19:58.902009] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:18.292 [2024-07-25 13:19:58.902040] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:18.292 [2024-07-25 13:19:58.902047] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:18.292 [2024-07-25 13:19:58.902053] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:18.292 13:19:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.292 13:19:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:18.552 13:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:18.552 "name": "Existed_Raid", 00:11:18.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.552 "strip_size_kb": 64, 00:11:18.552 "state": "configuring", 00:11:18.552 "raid_level": "raid0", 00:11:18.552 "superblock": false, 00:11:18.552 "num_base_bdevs": 2, 00:11:18.552 "num_base_bdevs_discovered": 0, 00:11:18.552 "num_base_bdevs_operational": 2, 00:11:18.552 "base_bdevs_list": [ 00:11:18.552 { 00:11:18.552 "name": "BaseBdev1", 00:11:18.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:18.552 "is_configured": false, 00:11:18.552 "data_offset": 0, 00:11:18.552 "data_size": 0 00:11:18.552 }, 00:11:18.552 { 00:11:18.552 "name": "BaseBdev2", 00:11:18.552 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:11:18.552 "is_configured": false, 00:11:18.552 "data_offset": 0, 00:11:18.552 "data_size": 0 00:11:18.552 } 00:11:18.552 ] 00:11:18.552 }' 00:11:18.552 13:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:18.552 13:19:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:19.120 13:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:19.120 [2024-07-25 13:19:59.828266] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:19.121 [2024-07-25 13:19:59.828292] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11da6b0 name Existed_Raid, state configuring 00:11:19.121 13:19:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:19.381 [2024-07-25 13:20:00.028801] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:19.381 [2024-07-25 13:20:00.028827] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:19.381 [2024-07-25 13:20:00.028833] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:19.381 [2024-07-25 13:20:00.028838] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:19.381 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:19.950 [2024-07-25 13:20:00.564539] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:19.950 BaseBdev1 
00:11:19.950 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:19.950 13:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:11:19.950 13:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:19.950 13:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:11:19.950 13:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:19.950 13:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:19.950 13:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:20.211 [ 00:11:20.211 { 00:11:20.211 "name": "BaseBdev1", 00:11:20.211 "aliases": [ 00:11:20.211 "a372a1cb-6cb1-4c54-92e6-3becb3ce3252" 00:11:20.211 ], 00:11:20.211 "product_name": "Malloc disk", 00:11:20.211 "block_size": 512, 00:11:20.211 "num_blocks": 65536, 00:11:20.211 "uuid": "a372a1cb-6cb1-4c54-92e6-3becb3ce3252", 00:11:20.211 "assigned_rate_limits": { 00:11:20.211 "rw_ios_per_sec": 0, 00:11:20.211 "rw_mbytes_per_sec": 0, 00:11:20.211 "r_mbytes_per_sec": 0, 00:11:20.211 "w_mbytes_per_sec": 0 00:11:20.211 }, 00:11:20.211 "claimed": true, 00:11:20.211 "claim_type": "exclusive_write", 00:11:20.211 "zoned": false, 00:11:20.211 "supported_io_types": { 00:11:20.211 "read": true, 00:11:20.211 "write": true, 00:11:20.211 "unmap": true, 00:11:20.211 "flush": true, 00:11:20.211 "reset": true, 00:11:20.211 "nvme_admin": false, 00:11:20.211 "nvme_io": false, 00:11:20.211 "nvme_io_md": 
false, 00:11:20.211 "write_zeroes": true, 00:11:20.211 "zcopy": true, 00:11:20.211 "get_zone_info": false, 00:11:20.211 "zone_management": false, 00:11:20.211 "zone_append": false, 00:11:20.211 "compare": false, 00:11:20.211 "compare_and_write": false, 00:11:20.211 "abort": true, 00:11:20.211 "seek_hole": false, 00:11:20.211 "seek_data": false, 00:11:20.211 "copy": true, 00:11:20.211 "nvme_iov_md": false 00:11:20.211 }, 00:11:20.211 "memory_domains": [ 00:11:20.211 { 00:11:20.211 "dma_device_id": "system", 00:11:20.211 "dma_device_type": 1 00:11:20.211 }, 00:11:20.211 { 00:11:20.211 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.211 "dma_device_type": 2 00:11:20.211 } 00:11:20.211 ], 00:11:20.211 "driver_specific": {} 00:11:20.211 } 00:11:20.211 ] 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:20.211 13:20:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:20.471 13:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:20.471 "name": "Existed_Raid", 00:11:20.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:20.471 "strip_size_kb": 64, 00:11:20.471 "state": "configuring", 00:11:20.471 "raid_level": "raid0", 00:11:20.471 "superblock": false, 00:11:20.471 "num_base_bdevs": 2, 00:11:20.471 "num_base_bdevs_discovered": 1, 00:11:20.471 "num_base_bdevs_operational": 2, 00:11:20.471 "base_bdevs_list": [ 00:11:20.471 { 00:11:20.471 "name": "BaseBdev1", 00:11:20.471 "uuid": "a372a1cb-6cb1-4c54-92e6-3becb3ce3252", 00:11:20.471 "is_configured": true, 00:11:20.471 "data_offset": 0, 00:11:20.471 "data_size": 65536 00:11:20.471 }, 00:11:20.471 { 00:11:20.471 "name": "BaseBdev2", 00:11:20.471 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:20.471 "is_configured": false, 00:11:20.471 "data_offset": 0, 00:11:20.471 "data_size": 0 00:11:20.471 } 00:11:20.471 ] 00:11:20.471 }' 00:11:20.471 13:20:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:20.471 13:20:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:21.408 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:21.668 [2024-07-25 13:20:02.248801] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:21.668 [2024-07-25 13:20:02.248830] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11d9fa0 name 
Existed_Raid, state configuring 00:11:21.668 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:21.668 [2024-07-25 13:20:02.445318] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:21.668 [2024-07-25 13:20:02.446454] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:21.668 [2024-07-25 13:20:02.446478] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@124 -- # local tmp 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:21.928 "name": "Existed_Raid", 00:11:21.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:21.928 "strip_size_kb": 64, 00:11:21.928 "state": "configuring", 00:11:21.928 "raid_level": "raid0", 00:11:21.928 "superblock": false, 00:11:21.928 "num_base_bdevs": 2, 00:11:21.928 "num_base_bdevs_discovered": 1, 00:11:21.928 "num_base_bdevs_operational": 2, 00:11:21.928 "base_bdevs_list": [ 00:11:21.928 { 00:11:21.928 "name": "BaseBdev1", 00:11:21.928 "uuid": "a372a1cb-6cb1-4c54-92e6-3becb3ce3252", 00:11:21.928 "is_configured": true, 00:11:21.928 "data_offset": 0, 00:11:21.928 "data_size": 65536 00:11:21.928 }, 00:11:21.928 { 00:11:21.928 "name": "BaseBdev2", 00:11:21.928 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:21.928 "is_configured": false, 00:11:21.928 "data_offset": 0, 00:11:21.928 "data_size": 0 00:11:21.928 } 00:11:21.928 ] 00:11:21.928 }' 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:21.928 13:20:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:22.499 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:22.758 [2024-07-25 13:20:03.360413] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:22.758 [2024-07-25 13:20:03.360438] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io 
device register 0x11dada0 00:11:22.758 [2024-07-25 13:20:03.360443] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:11:22.758 [2024-07-25 13:20:03.360596] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x137e870 00:11:22.758 [2024-07-25 13:20:03.360689] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11dada0 00:11:22.758 [2024-07-25 13:20:03.360695] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11dada0 00:11:22.758 [2024-07-25 13:20:03.360812] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:22.758 BaseBdev2 00:11:22.758 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:22.758 13:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:11:22.758 13:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:22.758 13:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:11:22.758 13:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:22.758 13:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:22.758 13:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:23.019 [ 00:11:23.019 { 00:11:23.019 "name": "BaseBdev2", 00:11:23.019 "aliases": [ 00:11:23.019 "f4066c8b-f00e-4a57-84d5-e4f6c96d6231" 00:11:23.019 ], 00:11:23.019 "product_name": "Malloc disk", 00:11:23.019 
"block_size": 512, 00:11:23.019 "num_blocks": 65536, 00:11:23.019 "uuid": "f4066c8b-f00e-4a57-84d5-e4f6c96d6231", 00:11:23.019 "assigned_rate_limits": { 00:11:23.019 "rw_ios_per_sec": 0, 00:11:23.019 "rw_mbytes_per_sec": 0, 00:11:23.019 "r_mbytes_per_sec": 0, 00:11:23.019 "w_mbytes_per_sec": 0 00:11:23.019 }, 00:11:23.019 "claimed": true, 00:11:23.019 "claim_type": "exclusive_write", 00:11:23.019 "zoned": false, 00:11:23.019 "supported_io_types": { 00:11:23.019 "read": true, 00:11:23.019 "write": true, 00:11:23.019 "unmap": true, 00:11:23.019 "flush": true, 00:11:23.019 "reset": true, 00:11:23.019 "nvme_admin": false, 00:11:23.019 "nvme_io": false, 00:11:23.019 "nvme_io_md": false, 00:11:23.019 "write_zeroes": true, 00:11:23.019 "zcopy": true, 00:11:23.019 "get_zone_info": false, 00:11:23.019 "zone_management": false, 00:11:23.019 "zone_append": false, 00:11:23.019 "compare": false, 00:11:23.019 "compare_and_write": false, 00:11:23.019 "abort": true, 00:11:23.019 "seek_hole": false, 00:11:23.019 "seek_data": false, 00:11:23.019 "copy": true, 00:11:23.019 "nvme_iov_md": false 00:11:23.019 }, 00:11:23.019 "memory_domains": [ 00:11:23.019 { 00:11:23.019 "dma_device_id": "system", 00:11:23.019 "dma_device_type": 1 00:11:23.019 }, 00:11:23.019 { 00:11:23.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:23.019 "dma_device_type": 2 00:11:23.019 } 00:11:23.019 ], 00:11:23.019 "driver_specific": {} 00:11:23.019 } 00:11:23.019 ] 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.019 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:23.279 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:23.279 "name": "Existed_Raid", 00:11:23.279 "uuid": "050c93d8-b1d4-446b-9bd7-b87ff93a4c2d", 00:11:23.279 "strip_size_kb": 64, 00:11:23.279 "state": "online", 00:11:23.279 "raid_level": "raid0", 00:11:23.279 "superblock": false, 00:11:23.279 "num_base_bdevs": 2, 00:11:23.279 "num_base_bdevs_discovered": 2, 00:11:23.279 "num_base_bdevs_operational": 2, 00:11:23.279 "base_bdevs_list": [ 00:11:23.279 { 00:11:23.279 "name": "BaseBdev1", 00:11:23.279 "uuid": "a372a1cb-6cb1-4c54-92e6-3becb3ce3252", 00:11:23.279 "is_configured": true, 00:11:23.279 "data_offset": 0, 00:11:23.279 "data_size": 65536 00:11:23.279 }, 00:11:23.279 { 00:11:23.279 
"name": "BaseBdev2", 00:11:23.279 "uuid": "f4066c8b-f00e-4a57-84d5-e4f6c96d6231", 00:11:23.280 "is_configured": true, 00:11:23.280 "data_offset": 0, 00:11:23.280 "data_size": 65536 00:11:23.280 } 00:11:23.280 ] 00:11:23.280 }' 00:11:23.280 13:20:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:23.280 13:20:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:23.850 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:23.850 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:23.850 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:23.850 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:23.850 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:23.850 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:23.850 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:23.850 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:24.110 [2024-07-25 13:20:04.643894] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:24.110 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:24.110 "name": "Existed_Raid", 00:11:24.110 "aliases": [ 00:11:24.110 "050c93d8-b1d4-446b-9bd7-b87ff93a4c2d" 00:11:24.110 ], 00:11:24.110 "product_name": "Raid Volume", 00:11:24.110 "block_size": 512, 00:11:24.110 "num_blocks": 131072, 00:11:24.110 "uuid": "050c93d8-b1d4-446b-9bd7-b87ff93a4c2d", 00:11:24.111 "assigned_rate_limits": { 00:11:24.111 
"rw_ios_per_sec": 0, 00:11:24.111 "rw_mbytes_per_sec": 0, 00:11:24.111 "r_mbytes_per_sec": 0, 00:11:24.111 "w_mbytes_per_sec": 0 00:11:24.111 }, 00:11:24.111 "claimed": false, 00:11:24.111 "zoned": false, 00:11:24.111 "supported_io_types": { 00:11:24.111 "read": true, 00:11:24.111 "write": true, 00:11:24.111 "unmap": true, 00:11:24.111 "flush": true, 00:11:24.111 "reset": true, 00:11:24.111 "nvme_admin": false, 00:11:24.111 "nvme_io": false, 00:11:24.111 "nvme_io_md": false, 00:11:24.111 "write_zeroes": true, 00:11:24.111 "zcopy": false, 00:11:24.111 "get_zone_info": false, 00:11:24.111 "zone_management": false, 00:11:24.111 "zone_append": false, 00:11:24.111 "compare": false, 00:11:24.111 "compare_and_write": false, 00:11:24.111 "abort": false, 00:11:24.111 "seek_hole": false, 00:11:24.111 "seek_data": false, 00:11:24.111 "copy": false, 00:11:24.111 "nvme_iov_md": false 00:11:24.111 }, 00:11:24.111 "memory_domains": [ 00:11:24.111 { 00:11:24.111 "dma_device_id": "system", 00:11:24.111 "dma_device_type": 1 00:11:24.111 }, 00:11:24.111 { 00:11:24.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.111 "dma_device_type": 2 00:11:24.111 }, 00:11:24.111 { 00:11:24.111 "dma_device_id": "system", 00:11:24.111 "dma_device_type": 1 00:11:24.111 }, 00:11:24.111 { 00:11:24.111 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.111 "dma_device_type": 2 00:11:24.111 } 00:11:24.111 ], 00:11:24.111 "driver_specific": { 00:11:24.111 "raid": { 00:11:24.111 "uuid": "050c93d8-b1d4-446b-9bd7-b87ff93a4c2d", 00:11:24.111 "strip_size_kb": 64, 00:11:24.111 "state": "online", 00:11:24.111 "raid_level": "raid0", 00:11:24.111 "superblock": false, 00:11:24.111 "num_base_bdevs": 2, 00:11:24.111 "num_base_bdevs_discovered": 2, 00:11:24.111 "num_base_bdevs_operational": 2, 00:11:24.111 "base_bdevs_list": [ 00:11:24.111 { 00:11:24.111 "name": "BaseBdev1", 00:11:24.111 "uuid": "a372a1cb-6cb1-4c54-92e6-3becb3ce3252", 00:11:24.111 "is_configured": true, 00:11:24.111 "data_offset": 0, 
00:11:24.111 "data_size": 65536 00:11:24.111 }, 00:11:24.111 { 00:11:24.111 "name": "BaseBdev2", 00:11:24.111 "uuid": "f4066c8b-f00e-4a57-84d5-e4f6c96d6231", 00:11:24.111 "is_configured": true, 00:11:24.111 "data_offset": 0, 00:11:24.111 "data_size": 65536 00:11:24.111 } 00:11:24.111 ] 00:11:24.111 } 00:11:24.111 } 00:11:24.111 }' 00:11:24.111 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:24.111 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:24.111 BaseBdev2' 00:11:24.111 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:24.111 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:24.111 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:24.372 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:24.372 "name": "BaseBdev1", 00:11:24.372 "aliases": [ 00:11:24.372 "a372a1cb-6cb1-4c54-92e6-3becb3ce3252" 00:11:24.372 ], 00:11:24.372 "product_name": "Malloc disk", 00:11:24.372 "block_size": 512, 00:11:24.372 "num_blocks": 65536, 00:11:24.372 "uuid": "a372a1cb-6cb1-4c54-92e6-3becb3ce3252", 00:11:24.372 "assigned_rate_limits": { 00:11:24.372 "rw_ios_per_sec": 0, 00:11:24.372 "rw_mbytes_per_sec": 0, 00:11:24.372 "r_mbytes_per_sec": 0, 00:11:24.372 "w_mbytes_per_sec": 0 00:11:24.372 }, 00:11:24.372 "claimed": true, 00:11:24.372 "claim_type": "exclusive_write", 00:11:24.372 "zoned": false, 00:11:24.372 "supported_io_types": { 00:11:24.372 "read": true, 00:11:24.372 "write": true, 00:11:24.372 "unmap": true, 00:11:24.372 "flush": true, 00:11:24.372 "reset": true, 00:11:24.372 "nvme_admin": false, 00:11:24.372 
"nvme_io": false, 00:11:24.372 "nvme_io_md": false, 00:11:24.372 "write_zeroes": true, 00:11:24.372 "zcopy": true, 00:11:24.372 "get_zone_info": false, 00:11:24.372 "zone_management": false, 00:11:24.372 "zone_append": false, 00:11:24.372 "compare": false, 00:11:24.372 "compare_and_write": false, 00:11:24.372 "abort": true, 00:11:24.372 "seek_hole": false, 00:11:24.372 "seek_data": false, 00:11:24.372 "copy": true, 00:11:24.372 "nvme_iov_md": false 00:11:24.372 }, 00:11:24.372 "memory_domains": [ 00:11:24.372 { 00:11:24.372 "dma_device_id": "system", 00:11:24.372 "dma_device_type": 1 00:11:24.372 }, 00:11:24.372 { 00:11:24.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.372 "dma_device_type": 2 00:11:24.372 } 00:11:24.372 ], 00:11:24.372 "driver_specific": {} 00:11:24.372 }' 00:11:24.372 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:24.372 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:24.372 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:24.372 13:20:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:24.372 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:24.372 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:24.372 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:24.373 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:24.633 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:24.633 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:24.633 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:24.633 13:20:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:24.633 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:24.633 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:24.633 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:24.894 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:24.894 "name": "BaseBdev2", 00:11:24.894 "aliases": [ 00:11:24.894 "f4066c8b-f00e-4a57-84d5-e4f6c96d6231" 00:11:24.894 ], 00:11:24.894 "product_name": "Malloc disk", 00:11:24.894 "block_size": 512, 00:11:24.894 "num_blocks": 65536, 00:11:24.894 "uuid": "f4066c8b-f00e-4a57-84d5-e4f6c96d6231", 00:11:24.894 "assigned_rate_limits": { 00:11:24.894 "rw_ios_per_sec": 0, 00:11:24.894 "rw_mbytes_per_sec": 0, 00:11:24.894 "r_mbytes_per_sec": 0, 00:11:24.894 "w_mbytes_per_sec": 0 00:11:24.894 }, 00:11:24.894 "claimed": true, 00:11:24.894 "claim_type": "exclusive_write", 00:11:24.894 "zoned": false, 00:11:24.894 "supported_io_types": { 00:11:24.894 "read": true, 00:11:24.894 "write": true, 00:11:24.894 "unmap": true, 00:11:24.894 "flush": true, 00:11:24.894 "reset": true, 00:11:24.894 "nvme_admin": false, 00:11:24.894 "nvme_io": false, 00:11:24.894 "nvme_io_md": false, 00:11:24.894 "write_zeroes": true, 00:11:24.894 "zcopy": true, 00:11:24.894 "get_zone_info": false, 00:11:24.894 "zone_management": false, 00:11:24.894 "zone_append": false, 00:11:24.894 "compare": false, 00:11:24.894 "compare_and_write": false, 00:11:24.894 "abort": true, 00:11:24.894 "seek_hole": false, 00:11:24.894 "seek_data": false, 00:11:24.894 "copy": true, 00:11:24.894 "nvme_iov_md": false 00:11:24.894 }, 00:11:24.894 "memory_domains": [ 00:11:24.894 { 00:11:24.894 "dma_device_id": "system", 00:11:24.894 "dma_device_type": 1 00:11:24.894 }, 
00:11:24.894 { 00:11:24.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:24.894 "dma_device_type": 2 00:11:24.894 } 00:11:24.894 ], 00:11:24.894 "driver_specific": {} 00:11:24.894 }' 00:11:24.894 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:24.894 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:24.894 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:24.894 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:24.894 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:24.894 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:24.894 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:25.154 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:25.154 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:25.154 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:25.154 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:25.154 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:25.154 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:25.415 [2024-07-25 13:20:05.975087] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:25.415 [2024-07-25 13:20:05.975107] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:25.415 [2024-07-25 13:20:05.975135] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:25.415 
13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.415 13:20:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:11:25.415 13:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:25.415 "name": "Existed_Raid", 00:11:25.415 "uuid": "050c93d8-b1d4-446b-9bd7-b87ff93a4c2d", 00:11:25.415 "strip_size_kb": 64, 00:11:25.415 "state": "offline", 00:11:25.415 "raid_level": "raid0", 00:11:25.415 "superblock": false, 00:11:25.415 "num_base_bdevs": 2, 00:11:25.415 "num_base_bdevs_discovered": 1, 00:11:25.415 "num_base_bdevs_operational": 1, 00:11:25.415 "base_bdevs_list": [ 00:11:25.415 { 00:11:25.415 "name": null, 00:11:25.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:25.415 "is_configured": false, 00:11:25.415 "data_offset": 0, 00:11:25.415 "data_size": 65536 00:11:25.415 }, 00:11:25.415 { 00:11:25.415 "name": "BaseBdev2", 00:11:25.415 "uuid": "f4066c8b-f00e-4a57-84d5-e4f6c96d6231", 00:11:25.415 "is_configured": true, 00:11:25.415 "data_offset": 0, 00:11:25.415 "data_size": 65536 00:11:25.415 } 00:11:25.415 ] 00:11:25.415 }' 00:11:25.415 13:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:25.415 13:20:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:25.985 13:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:25.985 13:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:25.985 13:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:25.985 13:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:26.246 13:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:26.246 13:20:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:26.246 13:20:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:26.246 [2024-07-25 13:20:07.021719] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:26.246 [2024-07-25 13:20:07.021757] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11dada0 name Existed_Raid, state offline 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 871123 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 871123 ']' 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 871123 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:11:26.506 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:26.507 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 871123 00:11:26.507 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 
-- # process_name=reactor_0 00:11:26.507 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:26.507 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 871123' 00:11:26.507 killing process with pid 871123 00:11:26.507 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 871123 00:11:26.507 [2024-07-25 13:20:07.285738] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:26.507 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 871123 00:11:26.507 [2024-07-25 13:20:07.286335] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:26.767 13:20:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:26.767 00:11:26.767 real 0m9.588s 00:11:26.767 user 0m17.405s 00:11:26.767 sys 0m1.449s 00:11:26.767 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:26.768 ************************************ 00:11:26.768 END TEST raid_state_function_test 00:11:26.768 ************************************ 00:11:26.768 13:20:07 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:11:26.768 13:20:07 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:26.768 13:20:07 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:26.768 13:20:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:26.768 ************************************ 00:11:26.768 START TEST raid_state_function_test_sb 00:11:26.768 ************************************ 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 2 true 00:11:26.768 13:20:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:26.768 
13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=873139 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 873139' 00:11:26.768 Process raid pid: 873139 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 873139 /var/tmp/spdk-raid.sock 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 873139 ']' 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:26.768 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:26.768 13:20:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:26.768 [2024-07-25 13:20:07.548651] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:11:26.768 [2024-07-25 13:20:07.548716] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:27.029 [2024-07-25 13:20:07.641056] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:27.029 [2024-07-25 13:20:07.716542] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:27.029 [2024-07-25 13:20:07.756678] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:27.029 [2024-07-25 13:20:07.756703] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:27.971 [2024-07-25 13:20:08.584651] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:27.971 [2024-07-25 13:20:08.584679] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:27.971 [2024-07-25 13:20:08.584686] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:27.971 [2024-07-25 13:20:08.584692] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:27.971 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:28.231 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:28.231 "name": "Existed_Raid", 00:11:28.231 "uuid": "fb8c35c3-1b7d-44ef-a992-989f456baedd", 00:11:28.231 "strip_size_kb": 64, 00:11:28.231 "state": "configuring", 00:11:28.231 "raid_level": "raid0", 00:11:28.231 "superblock": true, 00:11:28.231 "num_base_bdevs": 2, 00:11:28.231 
"num_base_bdevs_discovered": 0, 00:11:28.231 "num_base_bdevs_operational": 2, 00:11:28.231 "base_bdevs_list": [ 00:11:28.231 { 00:11:28.231 "name": "BaseBdev1", 00:11:28.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.231 "is_configured": false, 00:11:28.231 "data_offset": 0, 00:11:28.231 "data_size": 0 00:11:28.231 }, 00:11:28.231 { 00:11:28.231 "name": "BaseBdev2", 00:11:28.231 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:28.231 "is_configured": false, 00:11:28.231 "data_offset": 0, 00:11:28.231 "data_size": 0 00:11:28.231 } 00:11:28.231 ] 00:11:28.232 }' 00:11:28.232 13:20:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:28.232 13:20:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:28.802 13:20:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:28.802 [2024-07-25 13:20:09.514872] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:28.802 [2024-07-25 13:20:09.514892] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fff6b0 name Existed_Raid, state configuring 00:11:28.802 13:20:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:29.063 [2024-07-25 13:20:09.711390] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:29.063 [2024-07-25 13:20:09.711410] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:29.063 [2024-07-25 13:20:09.711415] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:29.063 [2024-07-25 13:20:09.711421] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:29.063 13:20:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:29.323 [2024-07-25 13:20:09.906562] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:29.323 BaseBdev1 00:11:29.323 13:20:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:29.323 13:20:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:11:29.323 13:20:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:29.323 13:20:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:11:29.323 13:20:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:29.323 13:20:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:29.323 13:20:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:29.585 13:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:29.585 [ 00:11:29.585 { 00:11:29.585 "name": "BaseBdev1", 00:11:29.585 "aliases": [ 00:11:29.585 "fd3e790f-26c4-4047-bb70-7c26e34b9056" 00:11:29.585 ], 00:11:29.585 "product_name": "Malloc disk", 00:11:29.585 "block_size": 512, 00:11:29.585 "num_blocks": 65536, 00:11:29.585 "uuid": "fd3e790f-26c4-4047-bb70-7c26e34b9056", 00:11:29.585 "assigned_rate_limits": { 00:11:29.585 "rw_ios_per_sec": 0, 00:11:29.585 "rw_mbytes_per_sec": 0, 
00:11:29.585 "r_mbytes_per_sec": 0, 00:11:29.585 "w_mbytes_per_sec": 0 00:11:29.585 }, 00:11:29.585 "claimed": true, 00:11:29.585 "claim_type": "exclusive_write", 00:11:29.585 "zoned": false, 00:11:29.585 "supported_io_types": { 00:11:29.585 "read": true, 00:11:29.585 "write": true, 00:11:29.585 "unmap": true, 00:11:29.585 "flush": true, 00:11:29.585 "reset": true, 00:11:29.585 "nvme_admin": false, 00:11:29.585 "nvme_io": false, 00:11:29.585 "nvme_io_md": false, 00:11:29.585 "write_zeroes": true, 00:11:29.585 "zcopy": true, 00:11:29.585 "get_zone_info": false, 00:11:29.585 "zone_management": false, 00:11:29.585 "zone_append": false, 00:11:29.585 "compare": false, 00:11:29.585 "compare_and_write": false, 00:11:29.585 "abort": true, 00:11:29.585 "seek_hole": false, 00:11:29.585 "seek_data": false, 00:11:29.585 "copy": true, 00:11:29.585 "nvme_iov_md": false 00:11:29.585 }, 00:11:29.585 "memory_domains": [ 00:11:29.585 { 00:11:29.585 "dma_device_id": "system", 00:11:29.585 "dma_device_type": 1 00:11:29.585 }, 00:11:29.585 { 00:11:29.585 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:29.585 "dma_device_type": 2 00:11:29.585 } 00:11:29.585 ], 00:11:29.585 "driver_specific": {} 00:11:29.585 } 00:11:29.585 ] 00:11:29.585 13:20:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:11:29.585 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:29.585 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:29.585 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:29.585 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:29.585 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:29.585 13:20:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:29.585 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:29.585 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:29.586 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:29.586 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:29.586 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.586 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:29.847 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:29.847 "name": "Existed_Raid", 00:11:29.847 "uuid": "31e3f35b-ee42-486d-8f35-8701fbfcb35a", 00:11:29.847 "strip_size_kb": 64, 00:11:29.847 "state": "configuring", 00:11:29.847 "raid_level": "raid0", 00:11:29.847 "superblock": true, 00:11:29.847 "num_base_bdevs": 2, 00:11:29.847 "num_base_bdevs_discovered": 1, 00:11:29.847 "num_base_bdevs_operational": 2, 00:11:29.847 "base_bdevs_list": [ 00:11:29.847 { 00:11:29.847 "name": "BaseBdev1", 00:11:29.847 "uuid": "fd3e790f-26c4-4047-bb70-7c26e34b9056", 00:11:29.847 "is_configured": true, 00:11:29.847 "data_offset": 2048, 00:11:29.847 "data_size": 63488 00:11:29.847 }, 00:11:29.847 { 00:11:29.847 "name": "BaseBdev2", 00:11:29.847 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:29.847 "is_configured": false, 00:11:29.847 "data_offset": 0, 00:11:29.847 "data_size": 0 00:11:29.847 } 00:11:29.847 ] 00:11:29.847 }' 00:11:29.847 13:20:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:29.847 13:20:10 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:30.418 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:30.678 [2024-07-25 13:20:11.245942] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:30.678 [2024-07-25 13:20:11.245967] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ffefa0 name Existed_Raid, state configuring 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:30.678 [2024-07-25 13:20:11.442469] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:30.678 [2024-07-25 13:20:11.443610] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:30.678 [2024-07-25 13:20:11.443633] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:30.678 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:30.938 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:30.938 13:20:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:30.938 "name": "Existed_Raid", 00:11:30.938 "uuid": "e911d66c-9c19-4775-885b-bd1f559c3770", 00:11:30.938 "strip_size_kb": 64, 00:11:30.938 "state": "configuring", 00:11:30.938 "raid_level": "raid0", 00:11:30.938 "superblock": true, 00:11:30.938 "num_base_bdevs": 2, 00:11:30.938 "num_base_bdevs_discovered": 1, 00:11:30.938 "num_base_bdevs_operational": 2, 00:11:30.938 "base_bdevs_list": [ 00:11:30.938 { 00:11:30.938 "name": "BaseBdev1", 00:11:30.938 "uuid": "fd3e790f-26c4-4047-bb70-7c26e34b9056", 00:11:30.938 "is_configured": true, 00:11:30.938 "data_offset": 2048, 00:11:30.938 "data_size": 63488 00:11:30.938 }, 00:11:30.938 { 00:11:30.938 "name": "BaseBdev2", 00:11:30.938 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:30.938 "is_configured": false, 00:11:30.938 "data_offset": 0, 00:11:30.938 "data_size": 0 00:11:30.938 } 00:11:30.938 ] 00:11:30.938 }' 00:11:30.938 13:20:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:30.938 13:20:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:31.509 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:31.769 [2024-07-25 13:20:12.317448] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:31.769 [2024-07-25 13:20:12.317564] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fffda0 00:11:31.769 [2024-07-25 13:20:12.317572] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:31.769 [2024-07-25 13:20:12.317713] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ffe980 00:11:31.769 [2024-07-25 13:20:12.317801] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fffda0 00:11:31.769 [2024-07-25 13:20:12.317806] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fffda0 00:11:31.769 [2024-07-25 13:20:12.317873] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:31.769 BaseBdev2 00:11:31.769 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:31.769 13:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:11:31.769 13:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:31.769 13:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:11:31.769 13:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:31.769 13:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 
00:11:31.769 13:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:31.770 13:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:32.030 [ 00:11:32.030 { 00:11:32.030 "name": "BaseBdev2", 00:11:32.030 "aliases": [ 00:11:32.030 "4cab0c7a-28c2-4d96-82c9-9e885feff6d6" 00:11:32.030 ], 00:11:32.030 "product_name": "Malloc disk", 00:11:32.030 "block_size": 512, 00:11:32.030 "num_blocks": 65536, 00:11:32.030 "uuid": "4cab0c7a-28c2-4d96-82c9-9e885feff6d6", 00:11:32.030 "assigned_rate_limits": { 00:11:32.030 "rw_ios_per_sec": 0, 00:11:32.030 "rw_mbytes_per_sec": 0, 00:11:32.030 "r_mbytes_per_sec": 0, 00:11:32.030 "w_mbytes_per_sec": 0 00:11:32.030 }, 00:11:32.030 "claimed": true, 00:11:32.030 "claim_type": "exclusive_write", 00:11:32.030 "zoned": false, 00:11:32.030 "supported_io_types": { 00:11:32.030 "read": true, 00:11:32.030 "write": true, 00:11:32.030 "unmap": true, 00:11:32.030 "flush": true, 00:11:32.030 "reset": true, 00:11:32.030 "nvme_admin": false, 00:11:32.030 "nvme_io": false, 00:11:32.030 "nvme_io_md": false, 00:11:32.030 "write_zeroes": true, 00:11:32.030 "zcopy": true, 00:11:32.030 "get_zone_info": false, 00:11:32.030 "zone_management": false, 00:11:32.030 "zone_append": false, 00:11:32.030 "compare": false, 00:11:32.030 "compare_and_write": false, 00:11:32.030 "abort": true, 00:11:32.030 "seek_hole": false, 00:11:32.030 "seek_data": false, 00:11:32.030 "copy": true, 00:11:32.030 "nvme_iov_md": false 00:11:32.030 }, 00:11:32.030 "memory_domains": [ 00:11:32.030 { 00:11:32.030 "dma_device_id": "system", 00:11:32.030 "dma_device_type": 1 00:11:32.030 }, 00:11:32.030 { 00:11:32.030 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.030 "dma_device_type": 2 
00:11:32.030 } 00:11:32.030 ], 00:11:32.030 "driver_specific": {} 00:11:32.030 } 00:11:32.030 ] 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.030 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:32.291 13:20:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:32.291 "name": "Existed_Raid", 00:11:32.291 "uuid": "e911d66c-9c19-4775-885b-bd1f559c3770", 00:11:32.291 "strip_size_kb": 64, 00:11:32.291 "state": "online", 00:11:32.291 "raid_level": "raid0", 00:11:32.291 "superblock": true, 00:11:32.291 "num_base_bdevs": 2, 00:11:32.291 "num_base_bdevs_discovered": 2, 00:11:32.291 "num_base_bdevs_operational": 2, 00:11:32.291 "base_bdevs_list": [ 00:11:32.291 { 00:11:32.291 "name": "BaseBdev1", 00:11:32.291 "uuid": "fd3e790f-26c4-4047-bb70-7c26e34b9056", 00:11:32.291 "is_configured": true, 00:11:32.291 "data_offset": 2048, 00:11:32.291 "data_size": 63488 00:11:32.291 }, 00:11:32.291 { 00:11:32.291 "name": "BaseBdev2", 00:11:32.291 "uuid": "4cab0c7a-28c2-4d96-82c9-9e885feff6d6", 00:11:32.291 "is_configured": true, 00:11:32.291 "data_offset": 2048, 00:11:32.291 "data_size": 63488 00:11:32.291 } 00:11:32.292 ] 00:11:32.292 }' 00:11:32.292 13:20:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.292 13:20:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:32.862 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:32.862 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:32.862 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:32.862 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:32.862 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:32.862 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:32.862 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:32.862 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:32.862 [2024-07-25 13:20:13.612941] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:32.862 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:32.862 "name": "Existed_Raid", 00:11:32.862 "aliases": [ 00:11:32.862 "e911d66c-9c19-4775-885b-bd1f559c3770" 00:11:32.862 ], 00:11:32.862 "product_name": "Raid Volume", 00:11:32.862 "block_size": 512, 00:11:32.862 "num_blocks": 126976, 00:11:32.862 "uuid": "e911d66c-9c19-4775-885b-bd1f559c3770", 00:11:32.862 "assigned_rate_limits": { 00:11:32.862 "rw_ios_per_sec": 0, 00:11:32.862 "rw_mbytes_per_sec": 0, 00:11:32.862 "r_mbytes_per_sec": 0, 00:11:32.862 "w_mbytes_per_sec": 0 00:11:32.862 }, 00:11:32.862 "claimed": false, 00:11:32.862 "zoned": false, 00:11:32.862 "supported_io_types": { 00:11:32.862 "read": true, 00:11:32.862 "write": true, 00:11:32.862 "unmap": true, 00:11:32.862 "flush": true, 00:11:32.862 "reset": true, 00:11:32.862 "nvme_admin": false, 00:11:32.862 "nvme_io": false, 00:11:32.862 "nvme_io_md": false, 00:11:32.862 "write_zeroes": true, 00:11:32.862 "zcopy": false, 00:11:32.862 "get_zone_info": false, 00:11:32.862 "zone_management": false, 00:11:32.862 "zone_append": false, 00:11:32.862 "compare": false, 00:11:32.862 "compare_and_write": false, 00:11:32.862 "abort": false, 00:11:32.862 "seek_hole": false, 00:11:32.862 "seek_data": false, 00:11:32.862 "copy": false, 00:11:32.862 "nvme_iov_md": false 00:11:32.862 }, 00:11:32.862 "memory_domains": [ 00:11:32.862 { 00:11:32.862 "dma_device_id": "system", 00:11:32.862 "dma_device_type": 1 00:11:32.862 }, 00:11:32.862 { 00:11:32.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.862 "dma_device_type": 2 00:11:32.862 }, 00:11:32.862 { 00:11:32.862 "dma_device_id": "system", 00:11:32.862 "dma_device_type": 1 00:11:32.862 
}, 00:11:32.862 { 00:11:32.862 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:32.862 "dma_device_type": 2 00:11:32.862 } 00:11:32.862 ], 00:11:32.862 "driver_specific": { 00:11:32.862 "raid": { 00:11:32.862 "uuid": "e911d66c-9c19-4775-885b-bd1f559c3770", 00:11:32.862 "strip_size_kb": 64, 00:11:32.862 "state": "online", 00:11:32.862 "raid_level": "raid0", 00:11:32.862 "superblock": true, 00:11:32.862 "num_base_bdevs": 2, 00:11:32.862 "num_base_bdevs_discovered": 2, 00:11:32.862 "num_base_bdevs_operational": 2, 00:11:32.862 "base_bdevs_list": [ 00:11:32.862 { 00:11:32.862 "name": "BaseBdev1", 00:11:32.862 "uuid": "fd3e790f-26c4-4047-bb70-7c26e34b9056", 00:11:32.862 "is_configured": true, 00:11:32.862 "data_offset": 2048, 00:11:32.862 "data_size": 63488 00:11:32.862 }, 00:11:32.862 { 00:11:32.862 "name": "BaseBdev2", 00:11:32.862 "uuid": "4cab0c7a-28c2-4d96-82c9-9e885feff6d6", 00:11:32.862 "is_configured": true, 00:11:32.862 "data_offset": 2048, 00:11:32.862 "data_size": 63488 00:11:32.862 } 00:11:32.862 ] 00:11:32.862 } 00:11:32.862 } 00:11:32.862 }' 00:11:32.862 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:33.123 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:33.123 BaseBdev2' 00:11:33.123 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:33.123 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:33.123 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:33.123 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:33.123 "name": "BaseBdev1", 00:11:33.123 "aliases": [ 00:11:33.123 
"fd3e790f-26c4-4047-bb70-7c26e34b9056" 00:11:33.123 ], 00:11:33.123 "product_name": "Malloc disk", 00:11:33.123 "block_size": 512, 00:11:33.123 "num_blocks": 65536, 00:11:33.123 "uuid": "fd3e790f-26c4-4047-bb70-7c26e34b9056", 00:11:33.123 "assigned_rate_limits": { 00:11:33.123 "rw_ios_per_sec": 0, 00:11:33.123 "rw_mbytes_per_sec": 0, 00:11:33.123 "r_mbytes_per_sec": 0, 00:11:33.123 "w_mbytes_per_sec": 0 00:11:33.123 }, 00:11:33.123 "claimed": true, 00:11:33.123 "claim_type": "exclusive_write", 00:11:33.123 "zoned": false, 00:11:33.123 "supported_io_types": { 00:11:33.123 "read": true, 00:11:33.123 "write": true, 00:11:33.123 "unmap": true, 00:11:33.123 "flush": true, 00:11:33.123 "reset": true, 00:11:33.123 "nvme_admin": false, 00:11:33.123 "nvme_io": false, 00:11:33.123 "nvme_io_md": false, 00:11:33.123 "write_zeroes": true, 00:11:33.123 "zcopy": true, 00:11:33.123 "get_zone_info": false, 00:11:33.123 "zone_management": false, 00:11:33.123 "zone_append": false, 00:11:33.123 "compare": false, 00:11:33.123 "compare_and_write": false, 00:11:33.123 "abort": true, 00:11:33.123 "seek_hole": false, 00:11:33.123 "seek_data": false, 00:11:33.123 "copy": true, 00:11:33.123 "nvme_iov_md": false 00:11:33.123 }, 00:11:33.123 "memory_domains": [ 00:11:33.123 { 00:11:33.123 "dma_device_id": "system", 00:11:33.123 "dma_device_type": 1 00:11:33.123 }, 00:11:33.123 { 00:11:33.123 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.123 "dma_device_type": 2 00:11:33.123 } 00:11:33.123 ], 00:11:33.123 "driver_specific": {} 00:11:33.123 }' 00:11:33.123 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.384 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.384 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:33.384 13:20:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.384 13:20:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.384 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:33.384 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.384 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.384 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:33.384 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.645 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:33.645 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:33.645 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:33.645 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:33.645 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:33.645 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:33.645 "name": "BaseBdev2", 00:11:33.645 "aliases": [ 00:11:33.645 "4cab0c7a-28c2-4d96-82c9-9e885feff6d6" 00:11:33.645 ], 00:11:33.645 "product_name": "Malloc disk", 00:11:33.645 "block_size": 512, 00:11:33.645 "num_blocks": 65536, 00:11:33.645 "uuid": "4cab0c7a-28c2-4d96-82c9-9e885feff6d6", 00:11:33.645 "assigned_rate_limits": { 00:11:33.645 "rw_ios_per_sec": 0, 00:11:33.645 "rw_mbytes_per_sec": 0, 00:11:33.645 "r_mbytes_per_sec": 0, 00:11:33.645 "w_mbytes_per_sec": 0 00:11:33.645 }, 00:11:33.645 "claimed": true, 00:11:33.645 "claim_type": "exclusive_write", 00:11:33.645 "zoned": false, 00:11:33.645 "supported_io_types": 
{ 00:11:33.645 "read": true, 00:11:33.645 "write": true, 00:11:33.645 "unmap": true, 00:11:33.645 "flush": true, 00:11:33.645 "reset": true, 00:11:33.645 "nvme_admin": false, 00:11:33.645 "nvme_io": false, 00:11:33.645 "nvme_io_md": false, 00:11:33.645 "write_zeroes": true, 00:11:33.645 "zcopy": true, 00:11:33.645 "get_zone_info": false, 00:11:33.645 "zone_management": false, 00:11:33.645 "zone_append": false, 00:11:33.645 "compare": false, 00:11:33.645 "compare_and_write": false, 00:11:33.645 "abort": true, 00:11:33.645 "seek_hole": false, 00:11:33.645 "seek_data": false, 00:11:33.645 "copy": true, 00:11:33.645 "nvme_iov_md": false 00:11:33.645 }, 00:11:33.645 "memory_domains": [ 00:11:33.645 { 00:11:33.645 "dma_device_id": "system", 00:11:33.645 "dma_device_type": 1 00:11:33.645 }, 00:11:33.645 { 00:11:33.645 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.645 "dma_device_type": 2 00:11:33.645 } 00:11:33.645 ], 00:11:33.645 "driver_specific": {} 00:11:33.645 }' 00:11:33.645 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.906 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:33.906 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:33.906 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.906 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:33.906 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:33.906 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.906 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:33.906 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:33.906 13:20:14 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:34.166 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:34.166 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:34.166 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:34.166 [2024-07-25 13:20:14.932084] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:34.166 [2024-07-25 13:20:14.932102] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:34.166 [2024-07-25 13:20:14.932132] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:34.166 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:34.166 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:11:34.166 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:34.166 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:34.167 13:20:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.167 13:20:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:34.427 13:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:34.427 "name": "Existed_Raid", 00:11:34.427 "uuid": "e911d66c-9c19-4775-885b-bd1f559c3770", 00:11:34.427 "strip_size_kb": 64, 00:11:34.427 "state": "offline", 00:11:34.427 "raid_level": "raid0", 00:11:34.427 "superblock": true, 00:11:34.427 "num_base_bdevs": 2, 00:11:34.427 "num_base_bdevs_discovered": 1, 00:11:34.427 "num_base_bdevs_operational": 1, 00:11:34.427 "base_bdevs_list": [ 00:11:34.427 { 00:11:34.427 "name": null, 00:11:34.427 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:34.427 "is_configured": false, 00:11:34.427 "data_offset": 2048, 00:11:34.427 "data_size": 63488 00:11:34.427 }, 00:11:34.427 { 00:11:34.427 "name": "BaseBdev2", 00:11:34.427 "uuid": "4cab0c7a-28c2-4d96-82c9-9e885feff6d6", 00:11:34.427 "is_configured": true, 00:11:34.427 "data_offset": 2048, 00:11:34.427 "data_size": 63488 00:11:34.427 } 00:11:34.427 ] 00:11:34.427 }' 00:11:34.427 13:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:34.427 13:20:15 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:34.998 13:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:34.998 13:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:34.998 13:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:34.998 13:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:35.258 13:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:35.258 13:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:35.258 13:20:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:35.258 [2024-07-25 13:20:16.022984] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:35.258 [2024-07-25 13:20:16.023021] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fffda0 name Existed_Raid, state offline 00:11:35.258 13:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:35.258 13:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:35.258 13:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.258 13:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:35.519 13:20:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 873139 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 873139 ']' 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 873139 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 873139 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 873139' 00:11:35.519 killing process with pid 873139 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 873139 00:11:35.519 [2024-07-25 13:20:16.299258] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:35.519 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 873139 00:11:35.519 [2024-07-25 13:20:16.299869] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:35.781 13:20:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:35.781 00:11:35.781 real 0m8.931s 00:11:35.781 user 0m16.213s 00:11:35.781 sys 0m1.374s 00:11:35.781 13:20:16 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:11:35.781 13:20:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:35.781 ************************************ 00:11:35.781 END TEST raid_state_function_test_sb 00:11:35.781 ************************************ 00:11:35.781 13:20:16 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:11:35.781 13:20:16 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:11:35.781 13:20:16 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:35.781 13:20:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:35.781 ************************************ 00:11:35.781 START TEST raid_superblock_test 00:11:35.781 ************************************ 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 2 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 
00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=874824 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 874824 /var/tmp/spdk-raid.sock 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 874824 ']' 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:35.781 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:35.781 13:20:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:35.781 [2024-07-25 13:20:16.557354] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:11:35.781 [2024-07-25 13:20:16.557412] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid874824 ] 00:11:36.042 [2024-07-25 13:20:16.650887] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:36.042 [2024-07-25 13:20:16.725936] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:36.042 [2024-07-25 13:20:16.765734] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:36.042 [2024-07-25 13:20:16.765759] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:36.982 13:20:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:36.983 malloc1 00:11:36.983 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:36.983 [2024-07-25 13:20:17.768155] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:36.983 [2024-07-25 13:20:17.768190] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:36.983 [2024-07-25 13:20:17.768202] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12519b0 00:11:36.983 [2024-07-25 13:20:17.768209] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:36.983 [2024-07-25 13:20:17.769488] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:36.983 [2024-07-25 13:20:17.769510] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:36.983 pt1 00:11:37.243 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:11:37.243 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:11:37.243 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:11:37.243 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:11:37.243 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:37.243 13:20:17 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:37.243 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:11:37.243 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:37.243 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:37.243 malloc2 00:11:37.243 13:20:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:37.503 [2024-07-25 13:20:18.159041] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:37.503 [2024-07-25 13:20:18.159072] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:37.503 [2024-07-25 13:20:18.159080] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1252db0 00:11:37.503 [2024-07-25 13:20:18.159087] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:37.503 [2024-07-25 13:20:18.160277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:37.503 [2024-07-25 13:20:18.160297] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:37.503 pt2 00:11:37.503 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:11:37.503 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:11:37.503 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:37.763 [2024-07-25 13:20:18.359571] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:37.763 [2024-07-25 13:20:18.360534] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:37.763 [2024-07-25 13:20:18.360641] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x13f56b0 00:11:37.763 [2024-07-25 13:20:18.360649] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:37.763 [2024-07-25 13:20:18.360796] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x124ab40 00:11:37.763 [2024-07-25 13:20:18.360899] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13f56b0 00:11:37.763 [2024-07-25 13:20:18.360904] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13f56b0 00:11:37.763 [2024-07-25 13:20:18.360981] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.763 13:20:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.763 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:38.023 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:38.023 "name": "raid_bdev1", 00:11:38.023 "uuid": "2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0", 00:11:38.023 "strip_size_kb": 64, 00:11:38.023 "state": "online", 00:11:38.023 "raid_level": "raid0", 00:11:38.023 "superblock": true, 00:11:38.023 "num_base_bdevs": 2, 00:11:38.023 "num_base_bdevs_discovered": 2, 00:11:38.023 "num_base_bdevs_operational": 2, 00:11:38.023 "base_bdevs_list": [ 00:11:38.023 { 00:11:38.023 "name": "pt1", 00:11:38.023 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:38.023 "is_configured": true, 00:11:38.023 "data_offset": 2048, 00:11:38.023 "data_size": 63488 00:11:38.023 }, 00:11:38.023 { 00:11:38.023 "name": "pt2", 00:11:38.023 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:38.023 "is_configured": true, 00:11:38.023 "data_offset": 2048, 00:11:38.023 "data_size": 63488 00:11:38.023 } 00:11:38.023 ] 00:11:38.023 }' 00:11:38.023 13:20:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:38.023 13:20:18 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:38.594 [2024-07-25 13:20:19.294124] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:38.594 "name": "raid_bdev1", 00:11:38.594 "aliases": [ 00:11:38.594 "2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0" 00:11:38.594 ], 00:11:38.594 "product_name": "Raid Volume", 00:11:38.594 "block_size": 512, 00:11:38.594 "num_blocks": 126976, 00:11:38.594 "uuid": "2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0", 00:11:38.594 "assigned_rate_limits": { 00:11:38.594 "rw_ios_per_sec": 0, 00:11:38.594 "rw_mbytes_per_sec": 0, 00:11:38.594 "r_mbytes_per_sec": 0, 00:11:38.594 "w_mbytes_per_sec": 0 00:11:38.594 }, 00:11:38.594 "claimed": false, 00:11:38.594 "zoned": false, 00:11:38.594 "supported_io_types": { 00:11:38.594 "read": true, 00:11:38.594 "write": true, 00:11:38.594 "unmap": true, 00:11:38.594 "flush": true, 00:11:38.594 "reset": true, 00:11:38.594 "nvme_admin": false, 00:11:38.594 "nvme_io": false, 00:11:38.594 "nvme_io_md": false, 00:11:38.594 "write_zeroes": true, 00:11:38.594 "zcopy": false, 00:11:38.594 "get_zone_info": false, 00:11:38.594 "zone_management": false, 00:11:38.594 "zone_append": false, 00:11:38.594 "compare": false, 00:11:38.594 "compare_and_write": false, 00:11:38.594 "abort": false, 00:11:38.594 "seek_hole": false, 00:11:38.594 "seek_data": false, 00:11:38.594 "copy": false, 00:11:38.594 "nvme_iov_md": false 00:11:38.594 }, 00:11:38.594 "memory_domains": 
[ 00:11:38.594 { 00:11:38.594 "dma_device_id": "system", 00:11:38.594 "dma_device_type": 1 00:11:38.594 }, 00:11:38.594 { 00:11:38.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.594 "dma_device_type": 2 00:11:38.594 }, 00:11:38.594 { 00:11:38.594 "dma_device_id": "system", 00:11:38.594 "dma_device_type": 1 00:11:38.594 }, 00:11:38.594 { 00:11:38.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.594 "dma_device_type": 2 00:11:38.594 } 00:11:38.594 ], 00:11:38.594 "driver_specific": { 00:11:38.594 "raid": { 00:11:38.594 "uuid": "2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0", 00:11:38.594 "strip_size_kb": 64, 00:11:38.594 "state": "online", 00:11:38.594 "raid_level": "raid0", 00:11:38.594 "superblock": true, 00:11:38.594 "num_base_bdevs": 2, 00:11:38.594 "num_base_bdevs_discovered": 2, 00:11:38.594 "num_base_bdevs_operational": 2, 00:11:38.594 "base_bdevs_list": [ 00:11:38.594 { 00:11:38.594 "name": "pt1", 00:11:38.594 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:38.594 "is_configured": true, 00:11:38.594 "data_offset": 2048, 00:11:38.594 "data_size": 63488 00:11:38.594 }, 00:11:38.594 { 00:11:38.594 "name": "pt2", 00:11:38.594 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:38.594 "is_configured": true, 00:11:38.594 "data_offset": 2048, 00:11:38.594 "data_size": 63488 00:11:38.594 } 00:11:38.594 ] 00:11:38.594 } 00:11:38.594 } 00:11:38.594 }' 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:38.594 pt2' 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:38.594 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:38.594 
13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:38.854 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:38.854 "name": "pt1", 00:11:38.854 "aliases": [ 00:11:38.854 "00000000-0000-0000-0000-000000000001" 00:11:38.854 ], 00:11:38.854 "product_name": "passthru", 00:11:38.854 "block_size": 512, 00:11:38.854 "num_blocks": 65536, 00:11:38.854 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:38.854 "assigned_rate_limits": { 00:11:38.854 "rw_ios_per_sec": 0, 00:11:38.854 "rw_mbytes_per_sec": 0, 00:11:38.854 "r_mbytes_per_sec": 0, 00:11:38.854 "w_mbytes_per_sec": 0 00:11:38.854 }, 00:11:38.854 "claimed": true, 00:11:38.854 "claim_type": "exclusive_write", 00:11:38.854 "zoned": false, 00:11:38.854 "supported_io_types": { 00:11:38.854 "read": true, 00:11:38.855 "write": true, 00:11:38.855 "unmap": true, 00:11:38.855 "flush": true, 00:11:38.855 "reset": true, 00:11:38.855 "nvme_admin": false, 00:11:38.855 "nvme_io": false, 00:11:38.855 "nvme_io_md": false, 00:11:38.855 "write_zeroes": true, 00:11:38.855 "zcopy": true, 00:11:38.855 "get_zone_info": false, 00:11:38.855 "zone_management": false, 00:11:38.855 "zone_append": false, 00:11:38.855 "compare": false, 00:11:38.855 "compare_and_write": false, 00:11:38.855 "abort": true, 00:11:38.855 "seek_hole": false, 00:11:38.855 "seek_data": false, 00:11:38.855 "copy": true, 00:11:38.855 "nvme_iov_md": false 00:11:38.855 }, 00:11:38.855 "memory_domains": [ 00:11:38.855 { 00:11:38.855 "dma_device_id": "system", 00:11:38.855 "dma_device_type": 1 00:11:38.855 }, 00:11:38.855 { 00:11:38.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:38.855 "dma_device_type": 2 00:11:38.855 } 00:11:38.855 ], 00:11:38.855 "driver_specific": { 00:11:38.855 "passthru": { 00:11:38.855 "name": "pt1", 00:11:38.855 "base_bdev_name": "malloc1" 00:11:38.855 } 00:11:38.855 } 00:11:38.855 }' 00:11:38.855 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:11:38.855 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.114 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:39.114 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.115 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.115 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:39.115 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.115 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.115 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:39.115 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.115 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.375 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:39.375 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:39.375 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:39.375 13:20:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:39.375 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:39.375 "name": "pt2", 00:11:39.375 "aliases": [ 00:11:39.375 "00000000-0000-0000-0000-000000000002" 00:11:39.375 ], 00:11:39.375 "product_name": "passthru", 00:11:39.375 "block_size": 512, 00:11:39.375 "num_blocks": 65536, 00:11:39.375 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:39.375 "assigned_rate_limits": { 00:11:39.375 "rw_ios_per_sec": 0, 00:11:39.375 "rw_mbytes_per_sec": 0, 
00:11:39.375 "r_mbytes_per_sec": 0, 00:11:39.375 "w_mbytes_per_sec": 0 00:11:39.375 }, 00:11:39.375 "claimed": true, 00:11:39.375 "claim_type": "exclusive_write", 00:11:39.375 "zoned": false, 00:11:39.375 "supported_io_types": { 00:11:39.375 "read": true, 00:11:39.375 "write": true, 00:11:39.375 "unmap": true, 00:11:39.375 "flush": true, 00:11:39.375 "reset": true, 00:11:39.375 "nvme_admin": false, 00:11:39.375 "nvme_io": false, 00:11:39.375 "nvme_io_md": false, 00:11:39.375 "write_zeroes": true, 00:11:39.375 "zcopy": true, 00:11:39.375 "get_zone_info": false, 00:11:39.375 "zone_management": false, 00:11:39.375 "zone_append": false, 00:11:39.375 "compare": false, 00:11:39.375 "compare_and_write": false, 00:11:39.375 "abort": true, 00:11:39.375 "seek_hole": false, 00:11:39.375 "seek_data": false, 00:11:39.375 "copy": true, 00:11:39.375 "nvme_iov_md": false 00:11:39.375 }, 00:11:39.375 "memory_domains": [ 00:11:39.375 { 00:11:39.375 "dma_device_id": "system", 00:11:39.375 "dma_device_type": 1 00:11:39.375 }, 00:11:39.375 { 00:11:39.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:39.375 "dma_device_type": 2 00:11:39.375 } 00:11:39.375 ], 00:11:39.375 "driver_specific": { 00:11:39.375 "passthru": { 00:11:39.375 "name": "pt2", 00:11:39.375 "base_bdev_name": "malloc2" 00:11:39.375 } 00:11:39.375 } 00:11:39.375 }' 00:11:39.375 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.636 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:39.636 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:39.636 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.636 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:39.636 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:39.636 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:11:39.636 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:39.636 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:39.636 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.636 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:39.897 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:39.897 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:39.897 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:11:39.897 [2024-07-25 13:20:20.633506] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:39.897 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0 00:11:39.897 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0 ']' 00:11:39.897 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:40.157 [2024-07-25 13:20:20.825793] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:40.157 [2024-07-25 13:20:20.825804] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:40.157 [2024-07-25 13:20:20.825842] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:40.157 [2024-07-25 13:20:20.825873] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:40.157 [2024-07-25 13:20:20.825879] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13f56b0 name raid_bdev1, state offline 00:11:40.157 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:40.157 13:20:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:11:40.418 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:11:40.418 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:11:40.418 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:11:40.418 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:40.678 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:11:40.678 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:40.678 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:40.678 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:11:40.939 
13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:40.939 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:11:41.201 [2024-07-25 13:20:21.796222] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:41.201 [2024-07-25 13:20:21.797294] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:41.201 [2024-07-25 13:20:21.797338] 
bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:41.201 [2024-07-25 13:20:21.797367] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:41.201 [2024-07-25 13:20:21.797379] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:41.201 [2024-07-25 13:20:21.797384] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1251e50 name raid_bdev1, state configuring 00:11:41.201 request: 00:11:41.201 { 00:11:41.201 "name": "raid_bdev1", 00:11:41.201 "raid_level": "raid0", 00:11:41.201 "base_bdevs": [ 00:11:41.201 "malloc1", 00:11:41.201 "malloc2" 00:11:41.201 ], 00:11:41.201 "strip_size_kb": 64, 00:11:41.201 "superblock": false, 00:11:41.201 "method": "bdev_raid_create", 00:11:41.201 "req_id": 1 00:11:41.201 } 00:11:41.201 Got JSON-RPC error response 00:11:41.201 response: 00:11:41.201 { 00:11:41.201 "code": -17, 00:11:41.201 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:41.201 } 00:11:41.201 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:11:41.201 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:11:41.201 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:11:41.201 13:20:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:11:41.201 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.201 13:20:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' 
']' 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:41.463 [2024-07-25 13:20:22.189176] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:41.463 [2024-07-25 13:20:22.189210] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:41.463 [2024-07-25 13:20:22.189223] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1251be0 00:11:41.463 [2024-07-25 13:20:22.189229] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:41.463 [2024-07-25 13:20:22.190495] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:41.463 [2024-07-25 13:20:22.190516] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:41.463 [2024-07-25 13:20:22.190571] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:41.463 [2024-07-25 13:20:22.190590] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:41.463 pt1 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:41.463 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:41.724 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:41.724 "name": "raid_bdev1", 00:11:41.724 "uuid": "2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0", 00:11:41.724 "strip_size_kb": 64, 00:11:41.724 "state": "configuring", 00:11:41.724 "raid_level": "raid0", 00:11:41.724 "superblock": true, 00:11:41.724 "num_base_bdevs": 2, 00:11:41.724 "num_base_bdevs_discovered": 1, 00:11:41.724 "num_base_bdevs_operational": 2, 00:11:41.724 "base_bdevs_list": [ 00:11:41.724 { 00:11:41.724 "name": "pt1", 00:11:41.724 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:41.724 "is_configured": true, 00:11:41.724 "data_offset": 2048, 00:11:41.724 "data_size": 63488 00:11:41.724 }, 00:11:41.724 { 00:11:41.724 "name": null, 00:11:41.724 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:41.724 "is_configured": false, 00:11:41.724 "data_offset": 2048, 00:11:41.724 "data_size": 63488 00:11:41.724 } 00:11:41.724 ] 00:11:41.724 }' 00:11:41.724 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:41.724 13:20:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:42.295 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:11:42.295 13:20:22 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:11:42.295 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:11:42.295 13:20:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:42.556 [2024-07-25 13:20:23.115536] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:42.556 [2024-07-25 13:20:23.115580] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:42.556 [2024-07-25 13:20:23.115592] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e9380 00:11:42.556 [2024-07-25 13:20:23.115606] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:42.556 [2024-07-25 13:20:23.115884] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:42.556 [2024-07-25 13:20:23.115895] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:42.556 [2024-07-25 13:20:23.115939] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:42.556 [2024-07-25 13:20:23.115953] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:42.556 [2024-07-25 13:20:23.116028] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x13ea100 00:11:42.556 [2024-07-25 13:20:23.116035] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:42.556 [2024-07-25 13:20:23.116168] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x124a640 00:11:42.556 [2024-07-25 13:20:23.116266] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13ea100 00:11:42.556 [2024-07-25 13:20:23.116272] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is 
created with name raid_bdev1, raid_bdev 0x13ea100 00:11:42.556 [2024-07-25 13:20:23.116345] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:42.556 pt2 00:11:42.556 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:11:42.556 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:42.557 "name": "raid_bdev1", 00:11:42.557 "uuid": 
"2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0", 00:11:42.557 "strip_size_kb": 64, 00:11:42.557 "state": "online", 00:11:42.557 "raid_level": "raid0", 00:11:42.557 "superblock": true, 00:11:42.557 "num_base_bdevs": 2, 00:11:42.557 "num_base_bdevs_discovered": 2, 00:11:42.557 "num_base_bdevs_operational": 2, 00:11:42.557 "base_bdevs_list": [ 00:11:42.557 { 00:11:42.557 "name": "pt1", 00:11:42.557 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:42.557 "is_configured": true, 00:11:42.557 "data_offset": 2048, 00:11:42.557 "data_size": 63488 00:11:42.557 }, 00:11:42.557 { 00:11:42.557 "name": "pt2", 00:11:42.557 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:42.557 "is_configured": true, 00:11:42.557 "data_offset": 2048, 00:11:42.557 "data_size": 63488 00:11:42.557 } 00:11:42.557 ] 00:11:42.557 }' 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:42.557 13:20:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:43.128 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:11:43.128 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:43.128 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:43.128 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:43.128 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:43.128 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:43.128 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:43.128 13:20:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:43.388 [2024-07-25 13:20:24.010092] 
bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:43.388 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:43.388 "name": "raid_bdev1", 00:11:43.388 "aliases": [ 00:11:43.388 "2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0" 00:11:43.388 ], 00:11:43.388 "product_name": "Raid Volume", 00:11:43.388 "block_size": 512, 00:11:43.388 "num_blocks": 126976, 00:11:43.388 "uuid": "2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0", 00:11:43.388 "assigned_rate_limits": { 00:11:43.388 "rw_ios_per_sec": 0, 00:11:43.388 "rw_mbytes_per_sec": 0, 00:11:43.388 "r_mbytes_per_sec": 0, 00:11:43.388 "w_mbytes_per_sec": 0 00:11:43.388 }, 00:11:43.388 "claimed": false, 00:11:43.388 "zoned": false, 00:11:43.388 "supported_io_types": { 00:11:43.388 "read": true, 00:11:43.388 "write": true, 00:11:43.388 "unmap": true, 00:11:43.388 "flush": true, 00:11:43.388 "reset": true, 00:11:43.388 "nvme_admin": false, 00:11:43.388 "nvme_io": false, 00:11:43.388 "nvme_io_md": false, 00:11:43.388 "write_zeroes": true, 00:11:43.388 "zcopy": false, 00:11:43.388 "get_zone_info": false, 00:11:43.388 "zone_management": false, 00:11:43.388 "zone_append": false, 00:11:43.388 "compare": false, 00:11:43.388 "compare_and_write": false, 00:11:43.388 "abort": false, 00:11:43.388 "seek_hole": false, 00:11:43.388 "seek_data": false, 00:11:43.388 "copy": false, 00:11:43.388 "nvme_iov_md": false 00:11:43.388 }, 00:11:43.388 "memory_domains": [ 00:11:43.388 { 00:11:43.388 "dma_device_id": "system", 00:11:43.388 "dma_device_type": 1 00:11:43.388 }, 00:11:43.388 { 00:11:43.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.388 "dma_device_type": 2 00:11:43.388 }, 00:11:43.388 { 00:11:43.388 "dma_device_id": "system", 00:11:43.388 "dma_device_type": 1 00:11:43.388 }, 00:11:43.388 { 00:11:43.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.388 "dma_device_type": 2 00:11:43.388 } 00:11:43.388 ], 00:11:43.388 "driver_specific": { 00:11:43.388 "raid": { 
00:11:43.388 "uuid": "2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0", 00:11:43.388 "strip_size_kb": 64, 00:11:43.388 "state": "online", 00:11:43.388 "raid_level": "raid0", 00:11:43.388 "superblock": true, 00:11:43.388 "num_base_bdevs": 2, 00:11:43.388 "num_base_bdevs_discovered": 2, 00:11:43.388 "num_base_bdevs_operational": 2, 00:11:43.388 "base_bdevs_list": [ 00:11:43.388 { 00:11:43.388 "name": "pt1", 00:11:43.388 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:43.388 "is_configured": true, 00:11:43.388 "data_offset": 2048, 00:11:43.388 "data_size": 63488 00:11:43.388 }, 00:11:43.388 { 00:11:43.388 "name": "pt2", 00:11:43.388 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:43.388 "is_configured": true, 00:11:43.388 "data_offset": 2048, 00:11:43.388 "data_size": 63488 00:11:43.388 } 00:11:43.388 ] 00:11:43.388 } 00:11:43.388 } 00:11:43.388 }' 00:11:43.388 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:43.388 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:43.388 pt2' 00:11:43.388 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:43.388 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:43.388 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:43.649 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:43.649 "name": "pt1", 00:11:43.649 "aliases": [ 00:11:43.649 "00000000-0000-0000-0000-000000000001" 00:11:43.649 ], 00:11:43.649 "product_name": "passthru", 00:11:43.649 "block_size": 512, 00:11:43.649 "num_blocks": 65536, 00:11:43.649 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:43.649 "assigned_rate_limits": { 00:11:43.649 
"rw_ios_per_sec": 0, 00:11:43.649 "rw_mbytes_per_sec": 0, 00:11:43.649 "r_mbytes_per_sec": 0, 00:11:43.649 "w_mbytes_per_sec": 0 00:11:43.649 }, 00:11:43.649 "claimed": true, 00:11:43.649 "claim_type": "exclusive_write", 00:11:43.649 "zoned": false, 00:11:43.649 "supported_io_types": { 00:11:43.649 "read": true, 00:11:43.649 "write": true, 00:11:43.649 "unmap": true, 00:11:43.649 "flush": true, 00:11:43.649 "reset": true, 00:11:43.649 "nvme_admin": false, 00:11:43.649 "nvme_io": false, 00:11:43.649 "nvme_io_md": false, 00:11:43.649 "write_zeroes": true, 00:11:43.649 "zcopy": true, 00:11:43.649 "get_zone_info": false, 00:11:43.649 "zone_management": false, 00:11:43.649 "zone_append": false, 00:11:43.649 "compare": false, 00:11:43.649 "compare_and_write": false, 00:11:43.649 "abort": true, 00:11:43.649 "seek_hole": false, 00:11:43.649 "seek_data": false, 00:11:43.649 "copy": true, 00:11:43.649 "nvme_iov_md": false 00:11:43.649 }, 00:11:43.649 "memory_domains": [ 00:11:43.649 { 00:11:43.649 "dma_device_id": "system", 00:11:43.649 "dma_device_type": 1 00:11:43.649 }, 00:11:43.649 { 00:11:43.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:43.649 "dma_device_type": 2 00:11:43.649 } 00:11:43.649 ], 00:11:43.649 "driver_specific": { 00:11:43.649 "passthru": { 00:11:43.649 "name": "pt1", 00:11:43.649 "base_bdev_name": "malloc1" 00:11:43.649 } 00:11:43.649 } 00:11:43.649 }' 00:11:43.649 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:43.649 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:43.649 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:43.649 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.649 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:43.649 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:43.649 
13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.910 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:43.910 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:43.910 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.910 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:43.910 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:43.910 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:43.910 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:43.910 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:44.170 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:44.170 "name": "pt2", 00:11:44.170 "aliases": [ 00:11:44.170 "00000000-0000-0000-0000-000000000002" 00:11:44.170 ], 00:11:44.170 "product_name": "passthru", 00:11:44.170 "block_size": 512, 00:11:44.170 "num_blocks": 65536, 00:11:44.170 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:44.170 "assigned_rate_limits": { 00:11:44.170 "rw_ios_per_sec": 0, 00:11:44.170 "rw_mbytes_per_sec": 0, 00:11:44.170 "r_mbytes_per_sec": 0, 00:11:44.170 "w_mbytes_per_sec": 0 00:11:44.170 }, 00:11:44.170 "claimed": true, 00:11:44.170 "claim_type": "exclusive_write", 00:11:44.170 "zoned": false, 00:11:44.170 "supported_io_types": { 00:11:44.170 "read": true, 00:11:44.170 "write": true, 00:11:44.170 "unmap": true, 00:11:44.170 "flush": true, 00:11:44.171 "reset": true, 00:11:44.171 "nvme_admin": false, 00:11:44.171 "nvme_io": false, 00:11:44.171 "nvme_io_md": false, 00:11:44.171 "write_zeroes": true, 00:11:44.171 
"zcopy": true, 00:11:44.171 "get_zone_info": false, 00:11:44.171 "zone_management": false, 00:11:44.171 "zone_append": false, 00:11:44.171 "compare": false, 00:11:44.171 "compare_and_write": false, 00:11:44.171 "abort": true, 00:11:44.171 "seek_hole": false, 00:11:44.171 "seek_data": false, 00:11:44.171 "copy": true, 00:11:44.171 "nvme_iov_md": false 00:11:44.171 }, 00:11:44.171 "memory_domains": [ 00:11:44.171 { 00:11:44.171 "dma_device_id": "system", 00:11:44.171 "dma_device_type": 1 00:11:44.171 }, 00:11:44.171 { 00:11:44.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:44.171 "dma_device_type": 2 00:11:44.171 } 00:11:44.171 ], 00:11:44.171 "driver_specific": { 00:11:44.171 "passthru": { 00:11:44.171 "name": "pt2", 00:11:44.171 "base_bdev_name": "malloc2" 00:11:44.171 } 00:11:44.171 } 00:11:44.171 }' 00:11:44.171 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:44.171 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:44.171 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:44.171 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:44.171 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:44.171 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:44.171 13:20:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:44.431 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:44.431 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:44.431 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:44.431 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:44.431 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == 
null ]] 00:11:44.431 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:44.431 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:11:44.692 [2024-07-25 13:20:25.313385] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0 '!=' 2be180a9-0ba3-4a50-9d2a-2ef2a0ab32a0 ']' 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 874824 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 874824 ']' 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 874824 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 874824 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 874824' 00:11:44.692 killing process with pid 874824 00:11:44.692 13:20:25 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 874824 00:11:44.692 [2024-07-25 13:20:25.383229] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:44.692 [2024-07-25 13:20:25.383267] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:44.692 [2024-07-25 13:20:25.383300] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:44.692 [2024-07-25 13:20:25.383306] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13ea100 name raid_bdev1, state offline 00:11:44.692 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 874824 00:11:44.692 [2024-07-25 13:20:25.392645] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:44.953 13:20:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:11:44.953 00:11:44.953 real 0m9.012s 00:11:44.953 user 0m16.450s 00:11:44.953 sys 0m1.364s 00:11:44.953 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:44.953 13:20:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:44.953 ************************************ 00:11:44.953 END TEST raid_superblock_test 00:11:44.953 ************************************ 00:11:44.953 13:20:25 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:11:44.953 13:20:25 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:44.953 13:20:25 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:44.953 13:20:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:44.953 ************************************ 00:11:44.953 START TEST raid_read_error_test 00:11:44.953 ************************************ 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 2 read 00:11:44.953 
13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:11:44.953 13:20:25 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.auhGTj8PlL 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=876560 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 876560 /var/tmp/spdk-raid.sock 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 876560 ']' 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:44.953 13:20:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:44.953 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:44.954 13:20:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:44.954 13:20:25 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:44.954 [2024-07-25 13:20:25.659014] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:11:44.954 [2024-07-25 13:20:25.659062] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid876560 ] 00:11:45.215 [2024-07-25 13:20:25.745707] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:45.215 [2024-07-25 13:20:25.810257] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:45.215 [2024-07-25 13:20:25.851868] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:45.215 [2024-07-25 13:20:25.851891] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:45.849 13:20:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:45.849 13:20:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:11:45.849 13:20:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:11:45.849 13:20:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:46.127 BaseBdev1_malloc 00:11:46.127 13:20:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:46.127 true 00:11:46.127 13:20:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:46.387 [2024-07-25 13:20:27.046339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:46.387 [2024-07-25 13:20:27.046372] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:11:46.387 [2024-07-25 13:20:27.046384] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17882a0 00:11:46.387 [2024-07-25 13:20:27.046390] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:46.387 [2024-07-25 13:20:27.047683] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:46.387 [2024-07-25 13:20:27.047703] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:46.388 BaseBdev1 00:11:46.388 13:20:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:11:46.388 13:20:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:46.959 BaseBdev2_malloc 00:11:46.959 13:20:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:47.219 true 00:11:47.219 13:20:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:47.219 [2024-07-25 13:20:27.978667] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:47.219 [2024-07-25 13:20:27.978696] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:47.219 [2024-07-25 13:20:27.978708] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1847420 00:11:47.219 [2024-07-25 13:20:27.978714] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:47.219 [2024-07-25 13:20:27.979903] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:47.219 [2024-07-25 13:20:27.979923] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:47.219 BaseBdev2 00:11:47.219 13:20:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:47.789 [2024-07-25 13:20:28.503998] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:47.789 [2024-07-25 13:20:28.505012] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:47.789 [2024-07-25 13:20:28.505141] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x18476c0 00:11:47.789 [2024-07-25 13:20:28.505148] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:47.789 [2024-07-25 13:20:28.505296] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x184c6f0 00:11:47.789 [2024-07-25 13:20:28.505407] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18476c0 00:11:47.789 [2024-07-25 13:20:28.505412] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18476c0 00:11:47.789 [2024-07-25 13:20:28.505499] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:47.789 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:48.049 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:48.049 "name": "raid_bdev1", 00:11:48.049 "uuid": "692d300f-35e3-4d50-9a05-1b13e78cb79d", 00:11:48.049 "strip_size_kb": 64, 00:11:48.049 "state": "online", 00:11:48.049 "raid_level": "raid0", 00:11:48.049 "superblock": true, 00:11:48.049 "num_base_bdevs": 2, 00:11:48.049 "num_base_bdevs_discovered": 2, 00:11:48.049 "num_base_bdevs_operational": 2, 00:11:48.049 "base_bdevs_list": [ 00:11:48.049 { 00:11:48.049 "name": "BaseBdev1", 00:11:48.049 "uuid": "b18e2e46-6e5c-5959-b912-c3468ce7f16d", 00:11:48.049 "is_configured": true, 00:11:48.049 "data_offset": 2048, 00:11:48.049 "data_size": 63488 00:11:48.049 }, 00:11:48.049 { 00:11:48.049 "name": "BaseBdev2", 00:11:48.049 "uuid": "05475cb5-e54c-5b89-becf-0b88244967da", 00:11:48.049 "is_configured": true, 00:11:48.049 "data_offset": 2048, 00:11:48.049 "data_size": 63488 00:11:48.049 } 00:11:48.049 ] 00:11:48.049 }' 00:11:48.049 13:20:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:48.049 13:20:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:48.619 13:20:29 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:11:48.619 13:20:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:48.619 [2024-07-25 13:20:29.370422] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1786fc0 00:11:49.559 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:49.819 13:20:30 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:49.819 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:50.115 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.115 "name": "raid_bdev1", 00:11:50.115 "uuid": "692d300f-35e3-4d50-9a05-1b13e78cb79d", 00:11:50.115 "strip_size_kb": 64, 00:11:50.115 "state": "online", 00:11:50.115 "raid_level": "raid0", 00:11:50.115 "superblock": true, 00:11:50.115 "num_base_bdevs": 2, 00:11:50.115 "num_base_bdevs_discovered": 2, 00:11:50.115 "num_base_bdevs_operational": 2, 00:11:50.115 "base_bdevs_list": [ 00:11:50.115 { 00:11:50.115 "name": "BaseBdev1", 00:11:50.115 "uuid": "b18e2e46-6e5c-5959-b912-c3468ce7f16d", 00:11:50.115 "is_configured": true, 00:11:50.115 "data_offset": 2048, 00:11:50.115 "data_size": 63488 00:11:50.115 }, 00:11:50.115 { 00:11:50.115 "name": "BaseBdev2", 00:11:50.115 "uuid": "05475cb5-e54c-5b89-becf-0b88244967da", 00:11:50.115 "is_configured": true, 00:11:50.115 "data_offset": 2048, 00:11:50.115 "data_size": 63488 00:11:50.115 } 00:11:50.115 ] 00:11:50.115 }' 00:11:50.115 13:20:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.115 13:20:30 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:50.684 [2024-07-25 13:20:31.393718] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:50.684 [2024-07-25 13:20:31.393746] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:11:50.684 [2024-07-25 13:20:31.396326] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:50.684 [2024-07-25 13:20:31.396347] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:50.684 [2024-07-25 13:20:31.396365] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:50.684 [2024-07-25 13:20:31.396371] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18476c0 name raid_bdev1, state offline 00:11:50.684 0 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 876560 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 876560 ']' 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 876560 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 876560 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 876560' 00:11:50.684 killing process with pid 876560 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 876560 00:11:50.684 [2024-07-25 13:20:31.462264] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:50.684 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 876560 00:11:50.684 [2024-07-25 13:20:31.468094] 
bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:50.944 13:20:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.auhGTj8PlL 00:11:50.944 13:20:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:11:50.944 13:20:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:11:50.944 13:20:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.50 00:11:50.944 13:20:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:11:50.944 13:20:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:50.944 13:20:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:50.944 13:20:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:50.944 00:11:50.944 real 0m6.011s 00:11:50.944 user 0m9.639s 00:11:50.944 sys 0m0.853s 00:11:50.944 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:50.944 13:20:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:50.944 ************************************ 00:11:50.944 END TEST raid_read_error_test 00:11:50.944 ************************************ 00:11:50.944 13:20:31 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:11:50.944 13:20:31 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:50.944 13:20:31 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:50.944 13:20:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:50.944 ************************************ 00:11:50.944 START TEST raid_write_error_test 00:11:50.944 ************************************ 00:11:50.944 13:20:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 2 write 00:11:50.944 13:20:31 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:11:50.944 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:11:50.944 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:11:50.944 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:11:50.944 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:11:50.944 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:11:50.944 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:11:50.945 13:20:31 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.5h6OuOlSYp 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=877669 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 877669 /var/tmp/spdk-raid.sock 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 877669 ']' 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:50.945 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:50.945 13:20:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:51.204 [2024-07-25 13:20:31.744637] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:11:51.204 [2024-07-25 13:20:31.744686] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid877669 ] 00:11:51.204 [2024-07-25 13:20:31.831608] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:51.204 [2024-07-25 13:20:31.896504] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:51.204 [2024-07-25 13:20:31.936600] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:51.204 [2024-07-25 13:20:31.936624] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:52.144 13:20:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:52.144 13:20:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:11:52.144 13:20:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:11:52.144 13:20:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:52.144 BaseBdev1_malloc 00:11:52.144 13:20:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:52.404 true 00:11:52.404 13:20:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:52.404 [2024-07-25 13:20:33.143106] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:52.404 [2024-07-25 13:20:33.143138] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:11:52.404 [2024-07-25 13:20:33.143150] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x143a2a0 00:11:52.404 [2024-07-25 13:20:33.143157] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:52.404 [2024-07-25 13:20:33.144652] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:52.404 [2024-07-25 13:20:33.144673] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:52.404 BaseBdev1 00:11:52.404 13:20:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:11:52.404 13:20:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:52.974 BaseBdev2_malloc 00:11:52.974 13:20:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:53.233 true 00:11:53.233 13:20:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:53.494 [2024-07-25 13:20:34.059361] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:53.494 [2024-07-25 13:20:34.059390] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:53.494 [2024-07-25 13:20:34.059402] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14f9420 00:11:53.494 [2024-07-25 13:20:34.059409] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:53.494 [2024-07-25 13:20:34.060596] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:53.494 [2024-07-25 13:20:34.060616] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:53.494 BaseBdev2 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:53.494 [2024-07-25 13:20:34.251870] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:53.494 [2024-07-25 13:20:34.252880] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:53.494 [2024-07-25 13:20:34.253012] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x14f96c0 00:11:53.494 [2024-07-25 13:20:34.253020] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:53.494 [2024-07-25 13:20:34.253169] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14fe6f0 00:11:53.494 [2024-07-25 13:20:34.253284] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14f96c0 00:11:53.494 [2024-07-25 13:20:34.253290] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14f96c0 00:11:53.494 [2024-07-25 13:20:34.253374] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:53.494 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:53.754 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:53.754 "name": "raid_bdev1", 00:11:53.754 "uuid": "50da4c80-b223-4c54-84dd-00ab57b500e7", 00:11:53.754 "strip_size_kb": 64, 00:11:53.754 "state": "online", 00:11:53.754 "raid_level": "raid0", 00:11:53.754 "superblock": true, 00:11:53.754 "num_base_bdevs": 2, 00:11:53.754 "num_base_bdevs_discovered": 2, 00:11:53.754 "num_base_bdevs_operational": 2, 00:11:53.754 "base_bdevs_list": [ 00:11:53.754 { 00:11:53.754 "name": "BaseBdev1", 00:11:53.754 "uuid": "13ce86c0-7a46-573c-9acf-66295668935a", 00:11:53.754 "is_configured": true, 00:11:53.754 "data_offset": 2048, 00:11:53.754 "data_size": 63488 00:11:53.754 }, 00:11:53.754 { 00:11:53.754 "name": "BaseBdev2", 00:11:53.754 "uuid": "b6ebf035-098c-5b1e-9b6d-09691969de74", 00:11:53.754 "is_configured": true, 00:11:53.754 "data_offset": 2048, 00:11:53.754 "data_size": 63488 00:11:53.754 } 00:11:53.754 ] 00:11:53.754 }' 00:11:53.754 13:20:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:53.755 13:20:34 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:54.324 
13:20:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:11:54.324 13:20:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:54.324 [2024-07-25 13:20:35.114292] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1438fc0 00:11:55.260 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:55.519 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:11:55.519 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:11:55.519 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:11:55.519 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:11:55.519 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:55.519 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:55.519 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:55.519 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:55.520 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:55.520 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:55.520 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:55.520 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:11:55.520 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:55.520 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.520 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:55.779 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:55.779 "name": "raid_bdev1", 00:11:55.779 "uuid": "50da4c80-b223-4c54-84dd-00ab57b500e7", 00:11:55.779 "strip_size_kb": 64, 00:11:55.779 "state": "online", 00:11:55.779 "raid_level": "raid0", 00:11:55.779 "superblock": true, 00:11:55.779 "num_base_bdevs": 2, 00:11:55.779 "num_base_bdevs_discovered": 2, 00:11:55.779 "num_base_bdevs_operational": 2, 00:11:55.779 "base_bdevs_list": [ 00:11:55.779 { 00:11:55.779 "name": "BaseBdev1", 00:11:55.779 "uuid": "13ce86c0-7a46-573c-9acf-66295668935a", 00:11:55.779 "is_configured": true, 00:11:55.779 "data_offset": 2048, 00:11:55.779 "data_size": 63488 00:11:55.779 }, 00:11:55.779 { 00:11:55.779 "name": "BaseBdev2", 00:11:55.779 "uuid": "b6ebf035-098c-5b1e-9b6d-09691969de74", 00:11:55.779 "is_configured": true, 00:11:55.779 "data_offset": 2048, 00:11:55.779 "data_size": 63488 00:11:55.779 } 00:11:55.779 ] 00:11:55.779 }' 00:11:55.779 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:55.779 13:20:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:56.346 13:20:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:56.346 [2024-07-25 13:20:37.119690] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:56.346 [2024-07-25 13:20:37.119716] bdev_raid.c:1886:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:11:56.346 [2024-07-25 13:20:37.122345] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:56.346 [2024-07-25 13:20:37.122366] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:56.346 [2024-07-25 13:20:37.122384] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:56.346 [2024-07-25 13:20:37.122390] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14f96c0 name raid_bdev1, state offline 00:11:56.346 0 00:11:56.604 13:20:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 877669 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 877669 ']' 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 877669 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 877669 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 877669' 00:11:56.605 killing process with pid 877669 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 877669 00:11:56.605 [2024-07-25 13:20:37.207316] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 877669 00:11:56.605 
[2024-07-25 13:20:37.212925] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.5h6OuOlSYp 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.50 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.50 != \0\.\0\0 ]] 00:11:56.605 00:11:56.605 real 0m5.668s 00:11:56.605 user 0m9.010s 00:11:56.605 sys 0m0.789s 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:11:56.605 13:20:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:56.605 ************************************ 00:11:56.605 END TEST raid_write_error_test 00:11:56.605 ************************************ 00:11:56.605 13:20:37 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:11:56.605 13:20:37 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:11:56.605 13:20:37 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:11:56.605 13:20:37 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:11:56.605 13:20:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:56.864 ************************************ 00:11:56.864 START TEST raid_state_function_test 00:11:56.864 ************************************ 00:11:56.864 
13:20:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 2 false 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:56.864 13:20:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:56.864 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=878686 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 878686' 00:11:56.865 Process raid pid: 878686 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 878686 /var/tmp/spdk-raid.sock 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 878686 ']' 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:56.865 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:11:56.865 13:20:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:56.865 [2024-07-25 13:20:37.483726] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:11:56.865 [2024-07-25 13:20:37.483781] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:56.865 [2024-07-25 13:20:37.577974] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:56.865 [2024-07-25 13:20:37.653494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:11:57.124 [2024-07-25 13:20:37.699831] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:57.124 [2024-07-25 13:20:37.699857] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:57.693 13:20:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:11:57.693 13:20:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:11:57.693 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:57.693 [2024-07-25 13:20:38.476310] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:57.693 [2024-07-25 13:20:38.476345] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:57.693 [2024-07-25 13:20:38.476351] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:57.693 [2024-07-25 13:20:38.476357] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: 
base bdev BaseBdev2 doesn't exist now 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.952 "name": "Existed_Raid", 00:11:57.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.952 "strip_size_kb": 64, 00:11:57.952 "state": "configuring", 00:11:57.952 "raid_level": "concat", 00:11:57.952 "superblock": false, 00:11:57.952 "num_base_bdevs": 2, 00:11:57.952 "num_base_bdevs_discovered": 0, 00:11:57.952 "num_base_bdevs_operational": 
2, 00:11:57.952 "base_bdevs_list": [ 00:11:57.952 { 00:11:57.952 "name": "BaseBdev1", 00:11:57.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.952 "is_configured": false, 00:11:57.952 "data_offset": 0, 00:11:57.952 "data_size": 0 00:11:57.952 }, 00:11:57.952 { 00:11:57.952 "name": "BaseBdev2", 00:11:57.952 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.952 "is_configured": false, 00:11:57.952 "data_offset": 0, 00:11:57.952 "data_size": 0 00:11:57.952 } 00:11:57.952 ] 00:11:57.952 }' 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.952 13:20:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:58.521 13:20:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:58.780 [2024-07-25 13:20:39.378500] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:58.780 [2024-07-25 13:20:39.378525] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x211d6b0 name Existed_Raid, state configuring 00:11:58.780 13:20:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:59.040 [2024-07-25 13:20:39.575000] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:59.040 [2024-07-25 13:20:39.575020] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:59.040 [2024-07-25 13:20:39.575025] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:59.040 [2024-07-25 13:20:39.575031] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:59.040 13:20:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:59.040 [2024-07-25 13:20:39.778158] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:59.040 BaseBdev1 00:11:59.040 13:20:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:59.040 13:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:11:59.040 13:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:11:59.040 13:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:11:59.040 13:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:11:59.040 13:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:11:59.040 13:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:59.300 13:20:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:59.559 [ 00:11:59.559 { 00:11:59.559 "name": "BaseBdev1", 00:11:59.559 "aliases": [ 00:11:59.559 "e6596c37-e4c0-4289-baa9-ff6e630d9849" 00:11:59.559 ], 00:11:59.559 "product_name": "Malloc disk", 00:11:59.559 "block_size": 512, 00:11:59.559 "num_blocks": 65536, 00:11:59.559 "uuid": "e6596c37-e4c0-4289-baa9-ff6e630d9849", 00:11:59.559 "assigned_rate_limits": { 00:11:59.559 "rw_ios_per_sec": 0, 00:11:59.560 "rw_mbytes_per_sec": 0, 00:11:59.560 "r_mbytes_per_sec": 0, 00:11:59.560 "w_mbytes_per_sec": 0 00:11:59.560 }, 00:11:59.560 "claimed": true, 
00:11:59.560 "claim_type": "exclusive_write", 00:11:59.560 "zoned": false, 00:11:59.560 "supported_io_types": { 00:11:59.560 "read": true, 00:11:59.560 "write": true, 00:11:59.560 "unmap": true, 00:11:59.560 "flush": true, 00:11:59.560 "reset": true, 00:11:59.560 "nvme_admin": false, 00:11:59.560 "nvme_io": false, 00:11:59.560 "nvme_io_md": false, 00:11:59.560 "write_zeroes": true, 00:11:59.560 "zcopy": true, 00:11:59.560 "get_zone_info": false, 00:11:59.560 "zone_management": false, 00:11:59.560 "zone_append": false, 00:11:59.560 "compare": false, 00:11:59.560 "compare_and_write": false, 00:11:59.560 "abort": true, 00:11:59.560 "seek_hole": false, 00:11:59.560 "seek_data": false, 00:11:59.560 "copy": true, 00:11:59.560 "nvme_iov_md": false 00:11:59.560 }, 00:11:59.560 "memory_domains": [ 00:11:59.560 { 00:11:59.560 "dma_device_id": "system", 00:11:59.560 "dma_device_type": 1 00:11:59.560 }, 00:11:59.560 { 00:11:59.560 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:59.560 "dma_device_type": 2 00:11:59.560 } 00:11:59.560 ], 00:11:59.560 "driver_specific": {} 00:11:59.560 } 00:11:59.560 ] 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:59.560 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:59.819 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:59.819 "name": "Existed_Raid", 00:11:59.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:59.819 "strip_size_kb": 64, 00:11:59.819 "state": "configuring", 00:11:59.819 "raid_level": "concat", 00:11:59.819 "superblock": false, 00:11:59.819 "num_base_bdevs": 2, 00:11:59.819 "num_base_bdevs_discovered": 1, 00:11:59.819 "num_base_bdevs_operational": 2, 00:11:59.819 "base_bdevs_list": [ 00:11:59.819 { 00:11:59.819 "name": "BaseBdev1", 00:11:59.819 "uuid": "e6596c37-e4c0-4289-baa9-ff6e630d9849", 00:11:59.819 "is_configured": true, 00:11:59.819 "data_offset": 0, 00:11:59.819 "data_size": 65536 00:11:59.819 }, 00:11:59.819 { 00:11:59.819 "name": "BaseBdev2", 00:11:59.819 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:59.819 "is_configured": false, 00:11:59.819 "data_offset": 0, 00:11:59.819 "data_size": 0 00:11:59.819 } 00:11:59.819 ] 00:11:59.819 }' 00:11:59.819 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:59.819 13:20:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:00.078 13:20:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:00.337 [2024-07-25 13:20:41.041363] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:00.337 [2024-07-25 13:20:41.041396] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x211cfa0 name Existed_Raid, state configuring 00:12:00.337 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:00.597 [2024-07-25 13:20:41.241902] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:00.597 [2024-07-25 13:20:41.243055] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:00.597 [2024-07-25 13:20:41.243082] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:00.597 13:20:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.597 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:00.856 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.856 "name": "Existed_Raid", 00:12:00.856 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:00.856 "strip_size_kb": 64, 00:12:00.856 "state": "configuring", 00:12:00.856 "raid_level": "concat", 00:12:00.856 "superblock": false, 00:12:00.856 "num_base_bdevs": 2, 00:12:00.856 "num_base_bdevs_discovered": 1, 00:12:00.856 "num_base_bdevs_operational": 2, 00:12:00.856 "base_bdevs_list": [ 00:12:00.856 { 00:12:00.856 "name": "BaseBdev1", 00:12:00.856 "uuid": "e6596c37-e4c0-4289-baa9-ff6e630d9849", 00:12:00.856 "is_configured": true, 00:12:00.857 "data_offset": 0, 00:12:00.857 "data_size": 65536 00:12:00.857 }, 00:12:00.857 { 00:12:00.857 "name": "BaseBdev2", 00:12:00.857 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:00.857 "is_configured": false, 00:12:00.857 "data_offset": 0, 00:12:00.857 "data_size": 0 00:12:00.857 } 00:12:00.857 ] 00:12:00.857 }' 00:12:00.857 13:20:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.857 13:20:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.426 13:20:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:01.426 [2024-07-25 13:20:42.189178] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:01.426 [2024-07-25 13:20:42.189206] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x211dda0 00:12:01.426 [2024-07-25 13:20:42.189210] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:12:01.426 [2024-07-25 13:20:42.189361] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x22c1870 00:12:01.426 [2024-07-25 13:20:42.189450] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x211dda0 00:12:01.426 [2024-07-25 13:20:42.189456] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x211dda0 00:12:01.426 [2024-07-25 13:20:42.189589] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:01.426 BaseBdev2 00:12:01.426 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:01.426 13:20:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:01.426 13:20:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:01.426 13:20:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:01.426 13:20:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:01.426 13:20:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:01.426 13:20:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:01.686 13:20:42 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:01.945 [ 00:12:01.945 { 00:12:01.945 "name": "BaseBdev2", 00:12:01.945 "aliases": [ 00:12:01.945 "ae5ef550-32dc-4a9a-b224-d7048e03adf9" 00:12:01.945 ], 00:12:01.945 "product_name": "Malloc disk", 00:12:01.945 "block_size": 512, 00:12:01.945 "num_blocks": 65536, 00:12:01.945 "uuid": "ae5ef550-32dc-4a9a-b224-d7048e03adf9", 00:12:01.945 "assigned_rate_limits": { 00:12:01.945 "rw_ios_per_sec": 0, 00:12:01.945 "rw_mbytes_per_sec": 0, 00:12:01.945 "r_mbytes_per_sec": 0, 00:12:01.945 "w_mbytes_per_sec": 0 00:12:01.945 }, 00:12:01.945 "claimed": true, 00:12:01.945 "claim_type": "exclusive_write", 00:12:01.945 "zoned": false, 00:12:01.945 "supported_io_types": { 00:12:01.945 "read": true, 00:12:01.945 "write": true, 00:12:01.945 "unmap": true, 00:12:01.945 "flush": true, 00:12:01.945 "reset": true, 00:12:01.945 "nvme_admin": false, 00:12:01.945 "nvme_io": false, 00:12:01.945 "nvme_io_md": false, 00:12:01.945 "write_zeroes": true, 00:12:01.945 "zcopy": true, 00:12:01.945 "get_zone_info": false, 00:12:01.945 "zone_management": false, 00:12:01.945 "zone_append": false, 00:12:01.945 "compare": false, 00:12:01.945 "compare_and_write": false, 00:12:01.945 "abort": true, 00:12:01.945 "seek_hole": false, 00:12:01.945 "seek_data": false, 00:12:01.945 "copy": true, 00:12:01.945 "nvme_iov_md": false 00:12:01.945 }, 00:12:01.945 "memory_domains": [ 00:12:01.945 { 00:12:01.945 "dma_device_id": "system", 00:12:01.945 "dma_device_type": 1 00:12:01.945 }, 00:12:01.945 { 00:12:01.945 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.945 "dma_device_type": 2 00:12:01.945 } 00:12:01.945 ], 00:12:01.945 "driver_specific": {} 00:12:01.945 } 00:12:01.945 ] 00:12:01.945 13:20:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:01.945 13:20:42 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:01.945 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:01.945 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:01.945 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:01.945 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:01.946 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:01.946 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:01.946 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:01.946 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.946 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.946 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.946 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.946 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.946 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:02.205 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.205 "name": "Existed_Raid", 00:12:02.205 "uuid": "cffd2f0c-8fe6-44f4-b856-193e16aa4f74", 00:12:02.205 "strip_size_kb": 64, 00:12:02.205 "state": "online", 00:12:02.205 "raid_level": "concat", 00:12:02.205 "superblock": false, 00:12:02.205 
"num_base_bdevs": 2, 00:12:02.205 "num_base_bdevs_discovered": 2, 00:12:02.205 "num_base_bdevs_operational": 2, 00:12:02.205 "base_bdevs_list": [ 00:12:02.205 { 00:12:02.205 "name": "BaseBdev1", 00:12:02.205 "uuid": "e6596c37-e4c0-4289-baa9-ff6e630d9849", 00:12:02.205 "is_configured": true, 00:12:02.205 "data_offset": 0, 00:12:02.205 "data_size": 65536 00:12:02.205 }, 00:12:02.205 { 00:12:02.205 "name": "BaseBdev2", 00:12:02.205 "uuid": "ae5ef550-32dc-4a9a-b224-d7048e03adf9", 00:12:02.205 "is_configured": true, 00:12:02.205 "data_offset": 0, 00:12:02.205 "data_size": 65536 00:12:02.205 } 00:12:02.205 ] 00:12:02.205 }' 00:12:02.205 13:20:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.205 13:20:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.779 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:02.779 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:02.779 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:02.779 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:02.779 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:02.779 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:02.779 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:02.779 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:02.779 [2024-07-25 13:20:43.536808] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:02.779 13:20:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:02.779 "name": "Existed_Raid", 00:12:02.779 "aliases": [ 00:12:02.779 "cffd2f0c-8fe6-44f4-b856-193e16aa4f74" 00:12:02.779 ], 00:12:02.779 "product_name": "Raid Volume", 00:12:02.779 "block_size": 512, 00:12:02.779 "num_blocks": 131072, 00:12:02.779 "uuid": "cffd2f0c-8fe6-44f4-b856-193e16aa4f74", 00:12:02.779 "assigned_rate_limits": { 00:12:02.779 "rw_ios_per_sec": 0, 00:12:02.779 "rw_mbytes_per_sec": 0, 00:12:02.779 "r_mbytes_per_sec": 0, 00:12:02.779 "w_mbytes_per_sec": 0 00:12:02.779 }, 00:12:02.779 "claimed": false, 00:12:02.779 "zoned": false, 00:12:02.779 "supported_io_types": { 00:12:02.779 "read": true, 00:12:02.779 "write": true, 00:12:02.779 "unmap": true, 00:12:02.779 "flush": true, 00:12:02.779 "reset": true, 00:12:02.779 "nvme_admin": false, 00:12:02.779 "nvme_io": false, 00:12:02.779 "nvme_io_md": false, 00:12:02.779 "write_zeroes": true, 00:12:02.779 "zcopy": false, 00:12:02.779 "get_zone_info": false, 00:12:02.779 "zone_management": false, 00:12:02.779 "zone_append": false, 00:12:02.779 "compare": false, 00:12:02.779 "compare_and_write": false, 00:12:02.779 "abort": false, 00:12:02.779 "seek_hole": false, 00:12:02.779 "seek_data": false, 00:12:02.779 "copy": false, 00:12:02.779 "nvme_iov_md": false 00:12:02.779 }, 00:12:02.779 "memory_domains": [ 00:12:02.779 { 00:12:02.779 "dma_device_id": "system", 00:12:02.779 "dma_device_type": 1 00:12:02.779 }, 00:12:02.779 { 00:12:02.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.779 "dma_device_type": 2 00:12:02.779 }, 00:12:02.779 { 00:12:02.779 "dma_device_id": "system", 00:12:02.779 "dma_device_type": 1 00:12:02.779 }, 00:12:02.779 { 00:12:02.779 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.779 "dma_device_type": 2 00:12:02.779 } 00:12:02.779 ], 00:12:02.779 "driver_specific": { 00:12:02.779 "raid": { 00:12:02.779 "uuid": "cffd2f0c-8fe6-44f4-b856-193e16aa4f74", 00:12:02.779 "strip_size_kb": 64, 00:12:02.779 "state": "online", 00:12:02.779 
"raid_level": "concat", 00:12:02.779 "superblock": false, 00:12:02.779 "num_base_bdevs": 2, 00:12:02.779 "num_base_bdevs_discovered": 2, 00:12:02.779 "num_base_bdevs_operational": 2, 00:12:02.779 "base_bdevs_list": [ 00:12:02.779 { 00:12:02.779 "name": "BaseBdev1", 00:12:02.779 "uuid": "e6596c37-e4c0-4289-baa9-ff6e630d9849", 00:12:02.779 "is_configured": true, 00:12:02.779 "data_offset": 0, 00:12:02.779 "data_size": 65536 00:12:02.779 }, 00:12:02.779 { 00:12:02.779 "name": "BaseBdev2", 00:12:02.779 "uuid": "ae5ef550-32dc-4a9a-b224-d7048e03adf9", 00:12:02.779 "is_configured": true, 00:12:02.779 "data_offset": 0, 00:12:02.779 "data_size": 65536 00:12:02.779 } 00:12:02.779 ] 00:12:02.779 } 00:12:02.779 } 00:12:02.779 }' 00:12:02.779 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:03.049 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:03.049 BaseBdev2' 00:12:03.049 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.049 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:03.049 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.049 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.049 "name": "BaseBdev1", 00:12:03.049 "aliases": [ 00:12:03.049 "e6596c37-e4c0-4289-baa9-ff6e630d9849" 00:12:03.049 ], 00:12:03.049 "product_name": "Malloc disk", 00:12:03.049 "block_size": 512, 00:12:03.049 "num_blocks": 65536, 00:12:03.049 "uuid": "e6596c37-e4c0-4289-baa9-ff6e630d9849", 00:12:03.049 "assigned_rate_limits": { 00:12:03.049 "rw_ios_per_sec": 0, 00:12:03.049 "rw_mbytes_per_sec": 0, 00:12:03.049 
"r_mbytes_per_sec": 0, 00:12:03.049 "w_mbytes_per_sec": 0 00:12:03.049 }, 00:12:03.049 "claimed": true, 00:12:03.049 "claim_type": "exclusive_write", 00:12:03.049 "zoned": false, 00:12:03.049 "supported_io_types": { 00:12:03.049 "read": true, 00:12:03.049 "write": true, 00:12:03.049 "unmap": true, 00:12:03.049 "flush": true, 00:12:03.049 "reset": true, 00:12:03.049 "nvme_admin": false, 00:12:03.049 "nvme_io": false, 00:12:03.049 "nvme_io_md": false, 00:12:03.049 "write_zeroes": true, 00:12:03.049 "zcopy": true, 00:12:03.049 "get_zone_info": false, 00:12:03.049 "zone_management": false, 00:12:03.049 "zone_append": false, 00:12:03.049 "compare": false, 00:12:03.049 "compare_and_write": false, 00:12:03.049 "abort": true, 00:12:03.049 "seek_hole": false, 00:12:03.049 "seek_data": false, 00:12:03.049 "copy": true, 00:12:03.049 "nvme_iov_md": false 00:12:03.049 }, 00:12:03.049 "memory_domains": [ 00:12:03.049 { 00:12:03.049 "dma_device_id": "system", 00:12:03.049 "dma_device_type": 1 00:12:03.049 }, 00:12:03.049 { 00:12:03.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.049 "dma_device_type": 2 00:12:03.049 } 00:12:03.049 ], 00:12:03.049 "driver_specific": {} 00:12:03.049 }' 00:12:03.049 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.049 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.308 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.308 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.308 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.308 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.308 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.308 13:20:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # 
jq .md_interleave 00:12:03.308 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:03.309 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.309 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.309 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:03.309 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.569 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:03.569 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.569 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.569 "name": "BaseBdev2", 00:12:03.569 "aliases": [ 00:12:03.569 "ae5ef550-32dc-4a9a-b224-d7048e03adf9" 00:12:03.569 ], 00:12:03.569 "product_name": "Malloc disk", 00:12:03.569 "block_size": 512, 00:12:03.569 "num_blocks": 65536, 00:12:03.569 "uuid": "ae5ef550-32dc-4a9a-b224-d7048e03adf9", 00:12:03.569 "assigned_rate_limits": { 00:12:03.569 "rw_ios_per_sec": 0, 00:12:03.569 "rw_mbytes_per_sec": 0, 00:12:03.569 "r_mbytes_per_sec": 0, 00:12:03.569 "w_mbytes_per_sec": 0 00:12:03.569 }, 00:12:03.569 "claimed": true, 00:12:03.569 "claim_type": "exclusive_write", 00:12:03.569 "zoned": false, 00:12:03.569 "supported_io_types": { 00:12:03.569 "read": true, 00:12:03.569 "write": true, 00:12:03.569 "unmap": true, 00:12:03.569 "flush": true, 00:12:03.569 "reset": true, 00:12:03.569 "nvme_admin": false, 00:12:03.569 "nvme_io": false, 00:12:03.569 "nvme_io_md": false, 00:12:03.569 "write_zeroes": true, 00:12:03.569 "zcopy": true, 00:12:03.569 "get_zone_info": false, 00:12:03.569 "zone_management": false, 00:12:03.569 "zone_append": 
false, 00:12:03.569 "compare": false, 00:12:03.569 "compare_and_write": false, 00:12:03.569 "abort": true, 00:12:03.569 "seek_hole": false, 00:12:03.569 "seek_data": false, 00:12:03.569 "copy": true, 00:12:03.569 "nvme_iov_md": false 00:12:03.569 }, 00:12:03.569 "memory_domains": [ 00:12:03.569 { 00:12:03.569 "dma_device_id": "system", 00:12:03.569 "dma_device_type": 1 00:12:03.569 }, 00:12:03.569 { 00:12:03.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.569 "dma_device_type": 2 00:12:03.569 } 00:12:03.569 ], 00:12:03.569 "driver_specific": {} 00:12:03.569 }' 00:12:03.569 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.569 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:03.829 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:12:04.089 [2024-07-25 13:20:44.783778] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:04.089 [2024-07-25 13:20:44.783799] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:04.089 [2024-07-25 13:20:44.783831] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:04.089 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:04.090 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:04.090 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:04.090 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:04.090 13:20:44 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:04.090 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.090 13:20:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:04.350 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:04.350 "name": "Existed_Raid", 00:12:04.350 "uuid": "cffd2f0c-8fe6-44f4-b856-193e16aa4f74", 00:12:04.350 "strip_size_kb": 64, 00:12:04.350 "state": "offline", 00:12:04.350 "raid_level": "concat", 00:12:04.350 "superblock": false, 00:12:04.350 "num_base_bdevs": 2, 00:12:04.350 "num_base_bdevs_discovered": 1, 00:12:04.350 "num_base_bdevs_operational": 1, 00:12:04.350 "base_bdevs_list": [ 00:12:04.350 { 00:12:04.350 "name": null, 00:12:04.350 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:04.350 "is_configured": false, 00:12:04.350 "data_offset": 0, 00:12:04.350 "data_size": 65536 00:12:04.350 }, 00:12:04.350 { 00:12:04.350 "name": "BaseBdev2", 00:12:04.350 "uuid": "ae5ef550-32dc-4a9a-b224-d7048e03adf9", 00:12:04.350 "is_configured": true, 00:12:04.350 "data_offset": 0, 00:12:04.350 "data_size": 65536 00:12:04.350 } 00:12:04.350 ] 00:12:04.350 }' 00:12:04.350 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:04.350 13:20:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:04.922 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:04.922 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:04.922 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:12:04.922 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:05.183 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:05.183 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:05.183 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:05.183 [2024-07-25 13:20:45.914683] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:05.183 [2024-07-25 13:20:45.914720] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x211dda0 name Existed_Raid, state offline 00:12:05.183 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:05.183 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:05.183 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.183 13:20:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 878686 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 878686 ']' 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 878686 00:12:05.443 13:20:46 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 878686 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 878686' 00:12:05.443 killing process with pid 878686 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 878686 00:12:05.443 [2024-07-25 13:20:46.178763] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:05.443 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 878686 00:12:05.443 [2024-07-25 13:20:46.179361] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:05.704 13:20:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:05.704 00:12:05.704 real 0m8.881s 00:12:05.704 user 0m16.070s 00:12:05.704 sys 0m1.423s 00:12:05.704 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:05.704 13:20:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.704 ************************************ 00:12:05.704 END TEST raid_state_function_test 00:12:05.705 ************************************ 00:12:05.705 13:20:46 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:12:05.705 13:20:46 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:05.705 13:20:46 bdev_raid -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:12:05.705 13:20:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:05.705 ************************************ 00:12:05.705 START TEST raid_state_function_test_sb 00:12:05.705 ************************************ 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 2 true 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:05.705 13:20:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=880428 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 880428' 00:12:05.705 Process raid pid: 880428 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 880428 /var/tmp/spdk-raid.sock 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 880428 ']' 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 
00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:05.705 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:05.705 13:20:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:05.705 [2024-07-25 13:20:46.439592] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:12:05.705 [2024-07-25 13:20:46.439642] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:05.967 [2024-07-25 13:20:46.528648] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:05.967 [2024-07-25 13:20:46.593048] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:05.967 [2024-07-25 13:20:46.632597] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:05.967 [2024-07-25 13:20:46.632620] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:06.539 13:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:06.539 13:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:12:06.539 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:06.801 [2024-07-25 13:20:47.447791] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:06.801 [2024-07-25 
13:20:47.447824] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:06.801 [2024-07-25 13:20:47.447830] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:06.801 [2024-07-25 13:20:47.447836] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.801 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:07.061 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:12:07.061 "name": "Existed_Raid", 00:12:07.061 "uuid": "88f5d723-75e6-4731-93fe-e11940660650", 00:12:07.062 "strip_size_kb": 64, 00:12:07.062 "state": "configuring", 00:12:07.062 "raid_level": "concat", 00:12:07.062 "superblock": true, 00:12:07.062 "num_base_bdevs": 2, 00:12:07.062 "num_base_bdevs_discovered": 0, 00:12:07.062 "num_base_bdevs_operational": 2, 00:12:07.062 "base_bdevs_list": [ 00:12:07.062 { 00:12:07.062 "name": "BaseBdev1", 00:12:07.062 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.062 "is_configured": false, 00:12:07.062 "data_offset": 0, 00:12:07.062 "data_size": 0 00:12:07.062 }, 00:12:07.062 { 00:12:07.062 "name": "BaseBdev2", 00:12:07.062 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:07.062 "is_configured": false, 00:12:07.062 "data_offset": 0, 00:12:07.062 "data_size": 0 00:12:07.062 } 00:12:07.062 ] 00:12:07.062 }' 00:12:07.062 13:20:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:07.062 13:20:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:07.631 13:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:07.631 [2024-07-25 13:20:48.382084] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:07.631 [2024-07-25 13:20:48.382108] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17526b0 name Existed_Raid, state configuring 00:12:07.631 13:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:07.912 [2024-07-25 13:20:48.570577] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:07.912 [2024-07-25 
13:20:48.570599] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:07.912 [2024-07-25 13:20:48.570604] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:07.912 [2024-07-25 13:20:48.570610] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:07.912 13:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:08.226 [2024-07-25 13:20:48.765609] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:08.226 BaseBdev1 00:12:08.226 13:20:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:08.226 13:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:08.226 13:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:08.226 13:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:08.226 13:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:08.226 13:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:08.226 13:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:08.226 13:20:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:08.487 [ 00:12:08.487 { 00:12:08.487 "name": "BaseBdev1", 00:12:08.487 "aliases": [ 00:12:08.487 "0ae6fb27-d3cd-404b-b0d0-9e47142aa308" 
00:12:08.487 ], 00:12:08.487 "product_name": "Malloc disk", 00:12:08.487 "block_size": 512, 00:12:08.487 "num_blocks": 65536, 00:12:08.487 "uuid": "0ae6fb27-d3cd-404b-b0d0-9e47142aa308", 00:12:08.487 "assigned_rate_limits": { 00:12:08.487 "rw_ios_per_sec": 0, 00:12:08.487 "rw_mbytes_per_sec": 0, 00:12:08.487 "r_mbytes_per_sec": 0, 00:12:08.487 "w_mbytes_per_sec": 0 00:12:08.487 }, 00:12:08.487 "claimed": true, 00:12:08.487 "claim_type": "exclusive_write", 00:12:08.487 "zoned": false, 00:12:08.487 "supported_io_types": { 00:12:08.487 "read": true, 00:12:08.487 "write": true, 00:12:08.487 "unmap": true, 00:12:08.487 "flush": true, 00:12:08.487 "reset": true, 00:12:08.487 "nvme_admin": false, 00:12:08.487 "nvme_io": false, 00:12:08.487 "nvme_io_md": false, 00:12:08.487 "write_zeroes": true, 00:12:08.487 "zcopy": true, 00:12:08.487 "get_zone_info": false, 00:12:08.487 "zone_management": false, 00:12:08.487 "zone_append": false, 00:12:08.487 "compare": false, 00:12:08.487 "compare_and_write": false, 00:12:08.487 "abort": true, 00:12:08.487 "seek_hole": false, 00:12:08.487 "seek_data": false, 00:12:08.487 "copy": true, 00:12:08.487 "nvme_iov_md": false 00:12:08.487 }, 00:12:08.487 "memory_domains": [ 00:12:08.487 { 00:12:08.487 "dma_device_id": "system", 00:12:08.487 "dma_device_type": 1 00:12:08.487 }, 00:12:08.487 { 00:12:08.487 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.487 "dma_device_type": 2 00:12:08.487 } 00:12:08.487 ], 00:12:08.487 "driver_specific": {} 00:12:08.487 } 00:12:08.487 ] 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.487 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:08.747 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:08.747 "name": "Existed_Raid", 00:12:08.747 "uuid": "e9b4fec4-49d3-4fa7-a807-dcbfd73e5a72", 00:12:08.747 "strip_size_kb": 64, 00:12:08.747 "state": "configuring", 00:12:08.747 "raid_level": "concat", 00:12:08.747 "superblock": true, 00:12:08.747 "num_base_bdevs": 2, 00:12:08.747 "num_base_bdevs_discovered": 1, 00:12:08.747 "num_base_bdevs_operational": 2, 00:12:08.747 "base_bdevs_list": [ 00:12:08.747 { 00:12:08.747 "name": "BaseBdev1", 00:12:08.747 "uuid": "0ae6fb27-d3cd-404b-b0d0-9e47142aa308", 00:12:08.747 "is_configured": true, 00:12:08.747 "data_offset": 2048, 00:12:08.747 "data_size": 63488 00:12:08.747 }, 00:12:08.747 { 00:12:08.747 "name": "BaseBdev2", 00:12:08.747 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:12:08.747 "is_configured": false, 00:12:08.747 "data_offset": 0, 00:12:08.747 "data_size": 0 00:12:08.747 } 00:12:08.747 ] 00:12:08.747 }' 00:12:08.747 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:08.747 13:20:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:09.316 13:20:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:09.316 [2024-07-25 13:20:50.088969] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:09.317 [2024-07-25 13:20:50.089000] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1751fa0 name Existed_Raid, state configuring 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:09.577 [2024-07-25 13:20:50.285501] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:09.577 [2024-07-25 13:20:50.286719] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:09.577 [2024-07-25 13:20:50.286744] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:09.577 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:09.837 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.837 "name": "Existed_Raid", 00:12:09.837 "uuid": "c75a3144-a4a8-41f7-99cb-491b4137d6f0", 00:12:09.837 "strip_size_kb": 64, 00:12:09.837 "state": "configuring", 00:12:09.837 "raid_level": "concat", 00:12:09.837 "superblock": true, 00:12:09.837 "num_base_bdevs": 2, 00:12:09.837 "num_base_bdevs_discovered": 1, 00:12:09.837 "num_base_bdevs_operational": 2, 00:12:09.837 "base_bdevs_list": [ 00:12:09.837 { 00:12:09.837 "name": "BaseBdev1", 00:12:09.837 "uuid": "0ae6fb27-d3cd-404b-b0d0-9e47142aa308", 00:12:09.837 "is_configured": true, 00:12:09.837 "data_offset": 2048, 00:12:09.837 "data_size": 
63488 00:12:09.837 }, 00:12:09.837 { 00:12:09.837 "name": "BaseBdev2", 00:12:09.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:09.837 "is_configured": false, 00:12:09.837 "data_offset": 0, 00:12:09.837 "data_size": 0 00:12:09.837 } 00:12:09.837 ] 00:12:09.837 }' 00:12:09.837 13:20:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.837 13:20:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:10.407 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:10.668 [2024-07-25 13:20:51.228862] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:10.668 [2024-07-25 13:20:51.228973] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1752da0 00:12:10.668 [2024-07-25 13:20:51.228982] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:10.668 [2024-07-25 13:20:51.229118] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17519b0 00:12:10.668 [2024-07-25 13:20:51.229204] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1752da0 00:12:10.668 [2024-07-25 13:20:51.229210] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1752da0 00:12:10.668 [2024-07-25 13:20:51.229277] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:10.668 BaseBdev2 00:12:10.668 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:10.668 13:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:10.668 13:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:10.668 13:20:51 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:10.668 13:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:10.668 13:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:10.668 13:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:10.928 13:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:10.928 [ 00:12:10.928 { 00:12:10.928 "name": "BaseBdev2", 00:12:10.928 "aliases": [ 00:12:10.928 "b5f74d69-c520-4e5d-ad0c-ddc21ebacf35" 00:12:10.928 ], 00:12:10.928 "product_name": "Malloc disk", 00:12:10.929 "block_size": 512, 00:12:10.929 "num_blocks": 65536, 00:12:10.929 "uuid": "b5f74d69-c520-4e5d-ad0c-ddc21ebacf35", 00:12:10.929 "assigned_rate_limits": { 00:12:10.929 "rw_ios_per_sec": 0, 00:12:10.929 "rw_mbytes_per_sec": 0, 00:12:10.929 "r_mbytes_per_sec": 0, 00:12:10.929 "w_mbytes_per_sec": 0 00:12:10.929 }, 00:12:10.929 "claimed": true, 00:12:10.929 "claim_type": "exclusive_write", 00:12:10.929 "zoned": false, 00:12:10.929 "supported_io_types": { 00:12:10.929 "read": true, 00:12:10.929 "write": true, 00:12:10.929 "unmap": true, 00:12:10.929 "flush": true, 00:12:10.929 "reset": true, 00:12:10.929 "nvme_admin": false, 00:12:10.929 "nvme_io": false, 00:12:10.929 "nvme_io_md": false, 00:12:10.929 "write_zeroes": true, 00:12:10.929 "zcopy": true, 00:12:10.929 "get_zone_info": false, 00:12:10.929 "zone_management": false, 00:12:10.929 "zone_append": false, 00:12:10.929 "compare": false, 00:12:10.929 "compare_and_write": false, 00:12:10.929 "abort": true, 00:12:10.929 "seek_hole": false, 00:12:10.929 "seek_data": false, 
00:12:10.929 "copy": true, 00:12:10.929 "nvme_iov_md": false 00:12:10.929 }, 00:12:10.929 "memory_domains": [ 00:12:10.929 { 00:12:10.929 "dma_device_id": "system", 00:12:10.929 "dma_device_type": 1 00:12:10.929 }, 00:12:10.929 { 00:12:10.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:10.929 "dma_device_type": 2 00:12:10.929 } 00:12:10.929 ], 00:12:10.929 "driver_specific": {} 00:12:10.929 } 00:12:10.929 ] 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.929 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:11.189 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:11.189 "name": "Existed_Raid", 00:12:11.189 "uuid": "c75a3144-a4a8-41f7-99cb-491b4137d6f0", 00:12:11.189 "strip_size_kb": 64, 00:12:11.189 "state": "online", 00:12:11.189 "raid_level": "concat", 00:12:11.189 "superblock": true, 00:12:11.189 "num_base_bdevs": 2, 00:12:11.189 "num_base_bdevs_discovered": 2, 00:12:11.189 "num_base_bdevs_operational": 2, 00:12:11.189 "base_bdevs_list": [ 00:12:11.189 { 00:12:11.189 "name": "BaseBdev1", 00:12:11.189 "uuid": "0ae6fb27-d3cd-404b-b0d0-9e47142aa308", 00:12:11.189 "is_configured": true, 00:12:11.189 "data_offset": 2048, 00:12:11.189 "data_size": 63488 00:12:11.189 }, 00:12:11.189 { 00:12:11.189 "name": "BaseBdev2", 00:12:11.189 "uuid": "b5f74d69-c520-4e5d-ad0c-ddc21ebacf35", 00:12:11.189 "is_configured": true, 00:12:11.189 "data_offset": 2048, 00:12:11.189 "data_size": 63488 00:12:11.189 } 00:12:11.189 ] 00:12:11.189 }' 00:12:11.189 13:20:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:11.189 13:20:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:11.758 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:11.758 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:11.758 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:11.758 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:11.758 13:20:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:11.758 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:11.758 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:11.758 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:12.019 [2024-07-25 13:20:52.693893] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:12.019 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:12.019 "name": "Existed_Raid", 00:12:12.019 "aliases": [ 00:12:12.019 "c75a3144-a4a8-41f7-99cb-491b4137d6f0" 00:12:12.019 ], 00:12:12.019 "product_name": "Raid Volume", 00:12:12.019 "block_size": 512, 00:12:12.019 "num_blocks": 126976, 00:12:12.019 "uuid": "c75a3144-a4a8-41f7-99cb-491b4137d6f0", 00:12:12.019 "assigned_rate_limits": { 00:12:12.019 "rw_ios_per_sec": 0, 00:12:12.019 "rw_mbytes_per_sec": 0, 00:12:12.019 "r_mbytes_per_sec": 0, 00:12:12.019 "w_mbytes_per_sec": 0 00:12:12.019 }, 00:12:12.019 "claimed": false, 00:12:12.019 "zoned": false, 00:12:12.019 "supported_io_types": { 00:12:12.019 "read": true, 00:12:12.019 "write": true, 00:12:12.019 "unmap": true, 00:12:12.019 "flush": true, 00:12:12.019 "reset": true, 00:12:12.019 "nvme_admin": false, 00:12:12.019 "nvme_io": false, 00:12:12.019 "nvme_io_md": false, 00:12:12.019 "write_zeroes": true, 00:12:12.019 "zcopy": false, 00:12:12.019 "get_zone_info": false, 00:12:12.019 "zone_management": false, 00:12:12.019 "zone_append": false, 00:12:12.019 "compare": false, 00:12:12.019 "compare_and_write": false, 00:12:12.019 "abort": false, 00:12:12.019 "seek_hole": false, 00:12:12.019 "seek_data": false, 00:12:12.019 "copy": false, 00:12:12.019 "nvme_iov_md": false 00:12:12.019 }, 00:12:12.019 
"memory_domains": [ 00:12:12.019 { 00:12:12.019 "dma_device_id": "system", 00:12:12.019 "dma_device_type": 1 00:12:12.019 }, 00:12:12.019 { 00:12:12.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.019 "dma_device_type": 2 00:12:12.019 }, 00:12:12.019 { 00:12:12.019 "dma_device_id": "system", 00:12:12.019 "dma_device_type": 1 00:12:12.019 }, 00:12:12.019 { 00:12:12.019 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.019 "dma_device_type": 2 00:12:12.019 } 00:12:12.019 ], 00:12:12.019 "driver_specific": { 00:12:12.019 "raid": { 00:12:12.019 "uuid": "c75a3144-a4a8-41f7-99cb-491b4137d6f0", 00:12:12.019 "strip_size_kb": 64, 00:12:12.019 "state": "online", 00:12:12.019 "raid_level": "concat", 00:12:12.019 "superblock": true, 00:12:12.019 "num_base_bdevs": 2, 00:12:12.019 "num_base_bdevs_discovered": 2, 00:12:12.019 "num_base_bdevs_operational": 2, 00:12:12.019 "base_bdevs_list": [ 00:12:12.019 { 00:12:12.019 "name": "BaseBdev1", 00:12:12.019 "uuid": "0ae6fb27-d3cd-404b-b0d0-9e47142aa308", 00:12:12.019 "is_configured": true, 00:12:12.019 "data_offset": 2048, 00:12:12.019 "data_size": 63488 00:12:12.019 }, 00:12:12.019 { 00:12:12.019 "name": "BaseBdev2", 00:12:12.019 "uuid": "b5f74d69-c520-4e5d-ad0c-ddc21ebacf35", 00:12:12.019 "is_configured": true, 00:12:12.019 "data_offset": 2048, 00:12:12.019 "data_size": 63488 00:12:12.019 } 00:12:12.019 ] 00:12:12.019 } 00:12:12.019 } 00:12:12.019 }' 00:12:12.019 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:12.019 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:12.019 BaseBdev2' 00:12:12.019 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:12.019 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:12.019 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:12.280 13:20:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:12.280 "name": "BaseBdev1", 00:12:12.280 "aliases": [ 00:12:12.280 "0ae6fb27-d3cd-404b-b0d0-9e47142aa308" 00:12:12.280 ], 00:12:12.280 "product_name": "Malloc disk", 00:12:12.280 "block_size": 512, 00:12:12.280 "num_blocks": 65536, 00:12:12.280 "uuid": "0ae6fb27-d3cd-404b-b0d0-9e47142aa308", 00:12:12.280 "assigned_rate_limits": { 00:12:12.280 "rw_ios_per_sec": 0, 00:12:12.280 "rw_mbytes_per_sec": 0, 00:12:12.280 "r_mbytes_per_sec": 0, 00:12:12.280 "w_mbytes_per_sec": 0 00:12:12.280 }, 00:12:12.280 "claimed": true, 00:12:12.280 "claim_type": "exclusive_write", 00:12:12.280 "zoned": false, 00:12:12.280 "supported_io_types": { 00:12:12.280 "read": true, 00:12:12.280 "write": true, 00:12:12.280 "unmap": true, 00:12:12.280 "flush": true, 00:12:12.280 "reset": true, 00:12:12.280 "nvme_admin": false, 00:12:12.280 "nvme_io": false, 00:12:12.280 "nvme_io_md": false, 00:12:12.280 "write_zeroes": true, 00:12:12.280 "zcopy": true, 00:12:12.280 "get_zone_info": false, 00:12:12.280 "zone_management": false, 00:12:12.280 "zone_append": false, 00:12:12.280 "compare": false, 00:12:12.280 "compare_and_write": false, 00:12:12.280 "abort": true, 00:12:12.280 "seek_hole": false, 00:12:12.280 "seek_data": false, 00:12:12.280 "copy": true, 00:12:12.280 "nvme_iov_md": false 00:12:12.280 }, 00:12:12.280 "memory_domains": [ 00:12:12.280 { 00:12:12.280 "dma_device_id": "system", 00:12:12.280 "dma_device_type": 1 00:12:12.280 }, 00:12:12.280 { 00:12:12.280 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.280 "dma_device_type": 2 00:12:12.280 } 00:12:12.280 ], 00:12:12.280 "driver_specific": {} 00:12:12.280 }' 00:12:12.280 13:20:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:12.280 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:12.280 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:12.280 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:12.540 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:12.540 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:12.540 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:12.540 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:12.540 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:12.540 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:12.540 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:12.800 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:12.800 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:12.800 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:12.800 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:12.800 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:12.800 "name": "BaseBdev2", 00:12:12.800 "aliases": [ 00:12:12.800 "b5f74d69-c520-4e5d-ad0c-ddc21ebacf35" 00:12:12.800 ], 00:12:12.800 "product_name": "Malloc disk", 00:12:12.800 "block_size": 512, 00:12:12.800 
"num_blocks": 65536, 00:12:12.800 "uuid": "b5f74d69-c520-4e5d-ad0c-ddc21ebacf35", 00:12:12.800 "assigned_rate_limits": { 00:12:12.800 "rw_ios_per_sec": 0, 00:12:12.800 "rw_mbytes_per_sec": 0, 00:12:12.800 "r_mbytes_per_sec": 0, 00:12:12.800 "w_mbytes_per_sec": 0 00:12:12.800 }, 00:12:12.800 "claimed": true, 00:12:12.800 "claim_type": "exclusive_write", 00:12:12.800 "zoned": false, 00:12:12.800 "supported_io_types": { 00:12:12.800 "read": true, 00:12:12.800 "write": true, 00:12:12.800 "unmap": true, 00:12:12.800 "flush": true, 00:12:12.800 "reset": true, 00:12:12.800 "nvme_admin": false, 00:12:12.800 "nvme_io": false, 00:12:12.800 "nvme_io_md": false, 00:12:12.800 "write_zeroes": true, 00:12:12.800 "zcopy": true, 00:12:12.800 "get_zone_info": false, 00:12:12.800 "zone_management": false, 00:12:12.800 "zone_append": false, 00:12:12.800 "compare": false, 00:12:12.800 "compare_and_write": false, 00:12:12.800 "abort": true, 00:12:12.800 "seek_hole": false, 00:12:12.800 "seek_data": false, 00:12:12.800 "copy": true, 00:12:12.800 "nvme_iov_md": false 00:12:12.800 }, 00:12:12.800 "memory_domains": [ 00:12:12.800 { 00:12:12.800 "dma_device_id": "system", 00:12:12.800 "dma_device_type": 1 00:12:12.800 }, 00:12:12.800 { 00:12:12.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:12.800 "dma_device_type": 2 00:12:12.800 } 00:12:12.800 ], 00:12:12.800 "driver_specific": {} 00:12:12.800 }' 00:12:12.800 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:13.060 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:13.060 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:13.060 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:13.060 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:13.060 13:20:53 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:13.060 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:13.060 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:13.060 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:13.060 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:13.060 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:13.319 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:13.319 13:20:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:13.319 [2024-07-25 13:20:54.061356] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:13.319 [2024-07-25 13:20:54.061400] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:13.319 [2024-07-25 13:20:54.061484] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:13.319 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:13.320 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:13.320 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:13.320 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.320 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:13.579 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:13.579 "name": "Existed_Raid", 00:12:13.579 "uuid": "c75a3144-a4a8-41f7-99cb-491b4137d6f0", 00:12:13.579 "strip_size_kb": 64, 00:12:13.579 "state": "offline", 00:12:13.579 "raid_level": "concat", 00:12:13.579 "superblock": true, 00:12:13.579 "num_base_bdevs": 2, 00:12:13.579 "num_base_bdevs_discovered": 1, 00:12:13.579 "num_base_bdevs_operational": 1, 00:12:13.579 "base_bdevs_list": [ 00:12:13.579 { 00:12:13.579 "name": null, 00:12:13.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:13.579 "is_configured": false, 00:12:13.579 "data_offset": 2048, 
00:12:13.579 "data_size": 63488 00:12:13.579 }, 00:12:13.579 { 00:12:13.579 "name": "BaseBdev2", 00:12:13.579 "uuid": "b5f74d69-c520-4e5d-ad0c-ddc21ebacf35", 00:12:13.579 "is_configured": true, 00:12:13.579 "data_offset": 2048, 00:12:13.579 "data_size": 63488 00:12:13.579 } 00:12:13.579 ] 00:12:13.579 }' 00:12:13.579 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:13.579 13:20:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:14.149 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:14.149 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:14.149 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.149 13:20:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:14.409 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:14.409 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:14.409 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:14.670 [2024-07-25 13:20:55.263277] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:14.670 [2024-07-25 13:20:55.263359] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1752da0 name Existed_Raid, state offline 00:12:14.670 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:14.670 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:14.670 13:20:55 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.670 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:14.930 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:14.930 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:14.930 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:14.930 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 880428 00:12:14.930 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 880428 ']' 00:12:14.930 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 880428 00:12:14.930 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:12:14.930 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:14.931 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 880428 00:12:14.931 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:14.931 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:14.931 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 880428' 00:12:14.931 killing process with pid 880428 00:12:14.931 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 880428 00:12:14.931 [2024-07-25 13:20:55.556566] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:14.931 13:20:55 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 880428 00:12:14.931 [2024-07-25 13:20:55.557939] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:15.191 13:20:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:15.191 00:12:15.191 real 0m9.477s 00:12:15.191 user 0m17.013s 00:12:15.191 sys 0m1.442s 00:12:15.191 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:15.191 13:20:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:15.191 ************************************ 00:12:15.191 END TEST raid_state_function_test_sb 00:12:15.191 ************************************ 00:12:15.191 13:20:55 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:12:15.191 13:20:55 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:12:15.191 13:20:55 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:15.191 13:20:55 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:15.191 ************************************ 00:12:15.191 START TEST raid_superblock_test 00:12:15.191 ************************************ 00:12:15.191 13:20:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 2 00:12:15.191 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:12:15.191 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:12:15.191 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:12:15.191 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:12:15.191 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local 
base_bdevs_pt 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=882194 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 882194 /var/tmp/spdk-raid.sock 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 882194 ']' 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX 
domain socket /var/tmp/spdk-raid.sock...' 00:12:15.192 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:15.192 13:20:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.451 [2024-07-25 13:20:56.002396] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:12:15.451 [2024-07-25 13:20:56.002450] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid882194 ] 00:12:15.451 [2024-07-25 13:20:56.091176] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:15.451 [2024-07-25 13:20:56.156398] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:15.451 [2024-07-25 13:20:56.195541] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:15.451 [2024-07-25 13:20:56.195568] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:16.391 13:20:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:16.391 13:20:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:12:16.391 13:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:12:16.391 13:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:16.391 13:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:12:16.391 13:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:12:16.391 13:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:16.391 13:20:56 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:16.391 13:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:12:16.391 13:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:16.391 13:20:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:16.391 malloc1 00:12:16.391 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:16.651 [2024-07-25 13:20:57.201618] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:16.651 [2024-07-25 13:20:57.201651] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:16.651 [2024-07-25 13:20:57.201664] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11079b0 00:12:16.651 [2024-07-25 13:20:57.201670] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:16.651 [2024-07-25 13:20:57.202968] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:16.651 [2024-07-25 13:20:57.202989] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:16.651 pt1 00:12:16.651 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:12:16.651 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:16.651 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:12:16.651 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:12:16.651 13:20:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:16.651 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:16.651 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:12:16.651 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:16.651 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:16.651 malloc2 00:12:16.651 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:16.911 [2024-07-25 13:20:57.572607] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:16.911 [2024-07-25 13:20:57.572635] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:16.911 [2024-07-25 13:20:57.572644] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1108db0 00:12:16.911 [2024-07-25 13:20:57.572650] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:16.911 [2024-07-25 13:20:57.573857] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:16.911 [2024-07-25 13:20:57.573877] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:16.911 pt2 00:12:16.911 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:12:16.911 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:16.911 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:12:17.171 [2024-07-25 13:20:57.749064] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:17.171 [2024-07-25 13:20:57.750041] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:17.171 [2024-07-25 13:20:57.750137] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x12ab6b0 00:12:17.171 [2024-07-25 13:20:57.750144] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:17.171 [2024-07-25 13:20:57.750292] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1100b70 00:12:17.171 [2024-07-25 13:20:57.750395] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12ab6b0 00:12:17.171 [2024-07-25 13:20:57.750401] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12ab6b0 00:12:17.171 [2024-07-25 13:20:57.750477] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:17.171 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:17.171 "name": "raid_bdev1", 00:12:17.171 "uuid": "ff4ac280-f4cc-4403-ab2f-8926a31f5949", 00:12:17.171 "strip_size_kb": 64, 00:12:17.171 "state": "online", 00:12:17.171 "raid_level": "concat", 00:12:17.171 "superblock": true, 00:12:17.171 "num_base_bdevs": 2, 00:12:17.171 "num_base_bdevs_discovered": 2, 00:12:17.171 "num_base_bdevs_operational": 2, 00:12:17.171 "base_bdevs_list": [ 00:12:17.171 { 00:12:17.171 "name": "pt1", 00:12:17.171 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:17.171 "is_configured": true, 00:12:17.171 "data_offset": 2048, 00:12:17.171 "data_size": 63488 00:12:17.171 }, 00:12:17.171 { 00:12:17.171 "name": "pt2", 00:12:17.171 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:17.171 "is_configured": true, 00:12:17.172 "data_offset": 2048, 00:12:17.172 "data_size": 63488 00:12:17.172 } 00:12:17.172 ] 00:12:17.172 }' 00:12:17.172 13:20:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:17.172 13:20:57 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:17.741 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:12:17.742 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:17.742 13:20:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:17.742 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:17.742 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:17.742 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:17.742 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:17.742 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:18.002 [2024-07-25 13:20:58.683597] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:18.002 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:18.002 "name": "raid_bdev1", 00:12:18.002 "aliases": [ 00:12:18.002 "ff4ac280-f4cc-4403-ab2f-8926a31f5949" 00:12:18.002 ], 00:12:18.002 "product_name": "Raid Volume", 00:12:18.002 "block_size": 512, 00:12:18.002 "num_blocks": 126976, 00:12:18.002 "uuid": "ff4ac280-f4cc-4403-ab2f-8926a31f5949", 00:12:18.002 "assigned_rate_limits": { 00:12:18.002 "rw_ios_per_sec": 0, 00:12:18.002 "rw_mbytes_per_sec": 0, 00:12:18.002 "r_mbytes_per_sec": 0, 00:12:18.002 "w_mbytes_per_sec": 0 00:12:18.002 }, 00:12:18.002 "claimed": false, 00:12:18.002 "zoned": false, 00:12:18.002 "supported_io_types": { 00:12:18.002 "read": true, 00:12:18.002 "write": true, 00:12:18.002 "unmap": true, 00:12:18.002 "flush": true, 00:12:18.002 "reset": true, 00:12:18.002 "nvme_admin": false, 00:12:18.002 "nvme_io": false, 00:12:18.002 "nvme_io_md": false, 00:12:18.002 "write_zeroes": true, 00:12:18.002 "zcopy": false, 00:12:18.002 "get_zone_info": false, 00:12:18.002 "zone_management": false, 00:12:18.002 "zone_append": false, 00:12:18.002 "compare": false, 00:12:18.002 "compare_and_write": false, 00:12:18.002 
"abort": false, 00:12:18.002 "seek_hole": false, 00:12:18.002 "seek_data": false, 00:12:18.002 "copy": false, 00:12:18.002 "nvme_iov_md": false 00:12:18.002 }, 00:12:18.002 "memory_domains": [ 00:12:18.002 { 00:12:18.002 "dma_device_id": "system", 00:12:18.002 "dma_device_type": 1 00:12:18.002 }, 00:12:18.002 { 00:12:18.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.002 "dma_device_type": 2 00:12:18.002 }, 00:12:18.002 { 00:12:18.002 "dma_device_id": "system", 00:12:18.002 "dma_device_type": 1 00:12:18.002 }, 00:12:18.002 { 00:12:18.002 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.002 "dma_device_type": 2 00:12:18.002 } 00:12:18.002 ], 00:12:18.002 "driver_specific": { 00:12:18.002 "raid": { 00:12:18.002 "uuid": "ff4ac280-f4cc-4403-ab2f-8926a31f5949", 00:12:18.002 "strip_size_kb": 64, 00:12:18.002 "state": "online", 00:12:18.002 "raid_level": "concat", 00:12:18.002 "superblock": true, 00:12:18.002 "num_base_bdevs": 2, 00:12:18.002 "num_base_bdevs_discovered": 2, 00:12:18.002 "num_base_bdevs_operational": 2, 00:12:18.002 "base_bdevs_list": [ 00:12:18.002 { 00:12:18.002 "name": "pt1", 00:12:18.002 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:18.002 "is_configured": true, 00:12:18.002 "data_offset": 2048, 00:12:18.002 "data_size": 63488 00:12:18.002 }, 00:12:18.002 { 00:12:18.002 "name": "pt2", 00:12:18.002 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:18.002 "is_configured": true, 00:12:18.002 "data_offset": 2048, 00:12:18.002 "data_size": 63488 00:12:18.002 } 00:12:18.002 ] 00:12:18.002 } 00:12:18.002 } 00:12:18.002 }' 00:12:18.002 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:18.002 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:18.002 pt2' 00:12:18.002 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:18.002 13:20:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:18.002 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:18.262 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:18.262 "name": "pt1", 00:12:18.262 "aliases": [ 00:12:18.262 "00000000-0000-0000-0000-000000000001" 00:12:18.262 ], 00:12:18.262 "product_name": "passthru", 00:12:18.262 "block_size": 512, 00:12:18.262 "num_blocks": 65536, 00:12:18.262 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:18.262 "assigned_rate_limits": { 00:12:18.262 "rw_ios_per_sec": 0, 00:12:18.262 "rw_mbytes_per_sec": 0, 00:12:18.262 "r_mbytes_per_sec": 0, 00:12:18.262 "w_mbytes_per_sec": 0 00:12:18.262 }, 00:12:18.262 "claimed": true, 00:12:18.262 "claim_type": "exclusive_write", 00:12:18.262 "zoned": false, 00:12:18.262 "supported_io_types": { 00:12:18.262 "read": true, 00:12:18.262 "write": true, 00:12:18.262 "unmap": true, 00:12:18.262 "flush": true, 00:12:18.262 "reset": true, 00:12:18.262 "nvme_admin": false, 00:12:18.262 "nvme_io": false, 00:12:18.262 "nvme_io_md": false, 00:12:18.262 "write_zeroes": true, 00:12:18.262 "zcopy": true, 00:12:18.262 "get_zone_info": false, 00:12:18.262 "zone_management": false, 00:12:18.262 "zone_append": false, 00:12:18.262 "compare": false, 00:12:18.262 "compare_and_write": false, 00:12:18.262 "abort": true, 00:12:18.262 "seek_hole": false, 00:12:18.262 "seek_data": false, 00:12:18.262 "copy": true, 00:12:18.262 "nvme_iov_md": false 00:12:18.262 }, 00:12:18.262 "memory_domains": [ 00:12:18.262 { 00:12:18.262 "dma_device_id": "system", 00:12:18.262 "dma_device_type": 1 00:12:18.262 }, 00:12:18.262 { 00:12:18.262 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.262 "dma_device_type": 2 00:12:18.262 } 00:12:18.262 ], 00:12:18.262 "driver_specific": { 00:12:18.262 "passthru": { 00:12:18.262 
"name": "pt1", 00:12:18.262 "base_bdev_name": "malloc1" 00:12:18.262 } 00:12:18.262 } 00:12:18.262 }' 00:12:18.262 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:18.262 13:20:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:18.262 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:18.262 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:18.522 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:18.522 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:18.522 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:18.523 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:18.523 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:18.523 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:18.523 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:18.523 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:18.523 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:18.523 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:18.523 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:18.783 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:18.783 "name": "pt2", 00:12:18.783 "aliases": [ 00:12:18.783 "00000000-0000-0000-0000-000000000002" 00:12:18.783 ], 00:12:18.783 "product_name": "passthru", 00:12:18.783 "block_size": 512, 00:12:18.783 
"num_blocks": 65536, 00:12:18.783 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:18.783 "assigned_rate_limits": { 00:12:18.783 "rw_ios_per_sec": 0, 00:12:18.783 "rw_mbytes_per_sec": 0, 00:12:18.783 "r_mbytes_per_sec": 0, 00:12:18.783 "w_mbytes_per_sec": 0 00:12:18.783 }, 00:12:18.783 "claimed": true, 00:12:18.783 "claim_type": "exclusive_write", 00:12:18.783 "zoned": false, 00:12:18.783 "supported_io_types": { 00:12:18.783 "read": true, 00:12:18.783 "write": true, 00:12:18.783 "unmap": true, 00:12:18.783 "flush": true, 00:12:18.784 "reset": true, 00:12:18.784 "nvme_admin": false, 00:12:18.784 "nvme_io": false, 00:12:18.784 "nvme_io_md": false, 00:12:18.784 "write_zeroes": true, 00:12:18.784 "zcopy": true, 00:12:18.784 "get_zone_info": false, 00:12:18.784 "zone_management": false, 00:12:18.784 "zone_append": false, 00:12:18.784 "compare": false, 00:12:18.784 "compare_and_write": false, 00:12:18.784 "abort": true, 00:12:18.784 "seek_hole": false, 00:12:18.784 "seek_data": false, 00:12:18.784 "copy": true, 00:12:18.784 "nvme_iov_md": false 00:12:18.784 }, 00:12:18.784 "memory_domains": [ 00:12:18.784 { 00:12:18.784 "dma_device_id": "system", 00:12:18.784 "dma_device_type": 1 00:12:18.784 }, 00:12:18.784 { 00:12:18.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:18.784 "dma_device_type": 2 00:12:18.784 } 00:12:18.784 ], 00:12:18.784 "driver_specific": { 00:12:18.784 "passthru": { 00:12:18.784 "name": "pt2", 00:12:18.784 "base_bdev_name": "malloc2" 00:12:18.784 } 00:12:18.784 } 00:12:18.784 }' 00:12:18.784 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:18.784 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:18.784 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:18.784 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:19.044 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # 
jq .md_size 00:12:19.044 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:19.044 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.044 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:19.044 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:19.044 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.044 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:19.044 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:19.044 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:19.044 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:12:19.304 [2024-07-25 13:20:59.966866] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:19.304 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=ff4ac280-f4cc-4403-ab2f-8926a31f5949 00:12:19.304 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z ff4ac280-f4cc-4403-ab2f-8926a31f5949 ']' 00:12:19.304 13:20:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:19.564 [2024-07-25 13:21:00.167185] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:19.564 [2024-07-25 13:21:00.167204] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:19.564 [2024-07-25 13:21:00.167245] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:19.564 [2024-07-25 
13:21:00.167276] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:19.564 [2024-07-25 13:21:00.167283] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ab6b0 name raid_bdev1, state offline 00:12:19.564 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.564 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:12:19.827 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:12:19.827 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:12:19.827 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:12:19.827 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:19.827 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:12:19.827 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:20.087 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:20.087 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:20.347 13:21:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:12:20.347 [2024-07-25 13:21:01.133593] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:20.347 [2024-07-25 
13:21:01.134657] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:20.347 [2024-07-25 13:21:01.134698] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:12:20.347 [2024-07-25 13:21:01.134726] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:20.347 [2024-07-25 13:21:01.134737] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:20.347 [2024-07-25 13:21:01.134743] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1107e50 name raid_bdev1, state configuring 00:12:20.347 request: 00:12:20.347 { 00:12:20.347 "name": "raid_bdev1", 00:12:20.347 "raid_level": "concat", 00:12:20.347 "base_bdevs": [ 00:12:20.347 "malloc1", 00:12:20.347 "malloc2" 00:12:20.347 ], 00:12:20.347 "strip_size_kb": 64, 00:12:20.347 "superblock": false, 00:12:20.347 "method": "bdev_raid_create", 00:12:20.347 "req_id": 1 00:12:20.347 } 00:12:20.347 Got JSON-RPC error response 00:12:20.347 response: 00:12:20.347 { 00:12:20.347 "code": -17, 00:12:20.347 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:20.347 } 00:12:20.608 13:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:12:20.608 13:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:12:20.608 13:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:12:20.608 13:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:12:20.608 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.608 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:12:20.608 13:21:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:12:20.608 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:12:20.608 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:20.868 [2024-07-25 13:21:01.518513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:20.868 [2024-07-25 13:21:01.518542] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:20.868 [2024-07-25 13:21:01.518565] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1107be0 00:12:20.868 [2024-07-25 13:21:01.518572] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:20.868 [2024-07-25 13:21:01.519824] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:20.868 [2024-07-25 13:21:01.519844] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:20.868 [2024-07-25 13:21:01.519889] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:20.868 [2024-07-25 13:21:01.519906] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:20.868 pt1 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:20.868 
13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:20.868 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:21.128 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:21.128 "name": "raid_bdev1", 00:12:21.128 "uuid": "ff4ac280-f4cc-4403-ab2f-8926a31f5949", 00:12:21.128 "strip_size_kb": 64, 00:12:21.128 "state": "configuring", 00:12:21.128 "raid_level": "concat", 00:12:21.128 "superblock": true, 00:12:21.128 "num_base_bdevs": 2, 00:12:21.128 "num_base_bdevs_discovered": 1, 00:12:21.128 "num_base_bdevs_operational": 2, 00:12:21.128 "base_bdevs_list": [ 00:12:21.128 { 00:12:21.128 "name": "pt1", 00:12:21.128 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:21.128 "is_configured": true, 00:12:21.128 "data_offset": 2048, 00:12:21.128 "data_size": 63488 00:12:21.128 }, 00:12:21.128 { 00:12:21.128 "name": null, 00:12:21.128 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:21.128 "is_configured": false, 00:12:21.128 "data_offset": 2048, 00:12:21.128 "data_size": 63488 00:12:21.128 } 00:12:21.128 ] 00:12:21.128 }' 00:12:21.128 13:21:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:21.128 13:21:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 
-- # set +x 00:12:21.699 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:12:21.699 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:12:21.699 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:12:21.699 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:21.699 [2024-07-25 13:21:02.476952] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:21.699 [2024-07-25 13:21:02.476986] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:21.699 [2024-07-25 13:21:02.476997] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x129f380 00:12:21.699 [2024-07-25 13:21:02.477003] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:21.699 [2024-07-25 13:21:02.477269] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:21.699 [2024-07-25 13:21:02.477281] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:21.699 [2024-07-25 13:21:02.477324] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:12:21.699 [2024-07-25 13:21:02.477341] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:21.699 [2024-07-25 13:21:02.477415] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x12a0100 00:12:21.699 [2024-07-25 13:21:02.477421] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:21.699 [2024-07-25 13:21:02.477562] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a0d30 00:12:21.699 [2024-07-25 13:21:02.477657] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid 
bdev generic 0x12a0100 00:12:21.699 [2024-07-25 13:21:02.477662] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12a0100 00:12:21.699 [2024-07-25 13:21:02.477733] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:21.699 pt2 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:21.959 "name": "raid_bdev1", 00:12:21.959 "uuid": "ff4ac280-f4cc-4403-ab2f-8926a31f5949", 00:12:21.959 "strip_size_kb": 64, 00:12:21.959 "state": "online", 00:12:21.959 "raid_level": "concat", 00:12:21.959 "superblock": true, 00:12:21.959 "num_base_bdevs": 2, 00:12:21.959 "num_base_bdevs_discovered": 2, 00:12:21.959 "num_base_bdevs_operational": 2, 00:12:21.959 "base_bdevs_list": [ 00:12:21.959 { 00:12:21.959 "name": "pt1", 00:12:21.959 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:21.959 "is_configured": true, 00:12:21.959 "data_offset": 2048, 00:12:21.959 "data_size": 63488 00:12:21.959 }, 00:12:21.959 { 00:12:21.959 "name": "pt2", 00:12:21.959 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:21.959 "is_configured": true, 00:12:21.959 "data_offset": 2048, 00:12:21.959 "data_size": 63488 00:12:21.959 } 00:12:21.959 ] 00:12:21.959 }' 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:21.959 13:21:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:22.529 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:12:22.529 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:22.529 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:22.529 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:22.529 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:22.529 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:22.529 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:22.529 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:22.790 [2024-07-25 13:21:03.379433] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:22.791 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:22.791 "name": "raid_bdev1", 00:12:22.791 "aliases": [ 00:12:22.791 "ff4ac280-f4cc-4403-ab2f-8926a31f5949" 00:12:22.791 ], 00:12:22.791 "product_name": "Raid Volume", 00:12:22.791 "block_size": 512, 00:12:22.791 "num_blocks": 126976, 00:12:22.791 "uuid": "ff4ac280-f4cc-4403-ab2f-8926a31f5949", 00:12:22.791 "assigned_rate_limits": { 00:12:22.791 "rw_ios_per_sec": 0, 00:12:22.791 "rw_mbytes_per_sec": 0, 00:12:22.791 "r_mbytes_per_sec": 0, 00:12:22.791 "w_mbytes_per_sec": 0 00:12:22.791 }, 00:12:22.791 "claimed": false, 00:12:22.791 "zoned": false, 00:12:22.791 "supported_io_types": { 00:12:22.791 "read": true, 00:12:22.791 "write": true, 00:12:22.791 "unmap": true, 00:12:22.791 "flush": true, 00:12:22.791 "reset": true, 00:12:22.791 "nvme_admin": false, 00:12:22.791 "nvme_io": false, 00:12:22.791 "nvme_io_md": false, 00:12:22.791 "write_zeroes": true, 00:12:22.791 "zcopy": false, 00:12:22.791 "get_zone_info": false, 00:12:22.791 "zone_management": false, 00:12:22.791 "zone_append": false, 00:12:22.791 "compare": false, 00:12:22.791 "compare_and_write": false, 00:12:22.791 "abort": false, 00:12:22.791 "seek_hole": false, 00:12:22.791 "seek_data": false, 00:12:22.791 "copy": false, 00:12:22.791 "nvme_iov_md": false 00:12:22.791 }, 00:12:22.791 "memory_domains": [ 00:12:22.791 { 00:12:22.791 "dma_device_id": "system", 00:12:22.791 "dma_device_type": 1 00:12:22.791 }, 00:12:22.791 { 00:12:22.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.791 "dma_device_type": 2 00:12:22.791 }, 00:12:22.791 { 00:12:22.791 "dma_device_id": "system", 00:12:22.791 "dma_device_type": 1 00:12:22.791 }, 00:12:22.791 { 00:12:22.791 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.791 "dma_device_type": 2 
00:12:22.791 } 00:12:22.791 ], 00:12:22.791 "driver_specific": { 00:12:22.791 "raid": { 00:12:22.791 "uuid": "ff4ac280-f4cc-4403-ab2f-8926a31f5949", 00:12:22.791 "strip_size_kb": 64, 00:12:22.791 "state": "online", 00:12:22.791 "raid_level": "concat", 00:12:22.791 "superblock": true, 00:12:22.791 "num_base_bdevs": 2, 00:12:22.791 "num_base_bdevs_discovered": 2, 00:12:22.791 "num_base_bdevs_operational": 2, 00:12:22.791 "base_bdevs_list": [ 00:12:22.791 { 00:12:22.791 "name": "pt1", 00:12:22.791 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:22.791 "is_configured": true, 00:12:22.791 "data_offset": 2048, 00:12:22.791 "data_size": 63488 00:12:22.791 }, 00:12:22.791 { 00:12:22.791 "name": "pt2", 00:12:22.791 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:22.791 "is_configured": true, 00:12:22.791 "data_offset": 2048, 00:12:22.791 "data_size": 63488 00:12:22.791 } 00:12:22.791 ] 00:12:22.791 } 00:12:22.791 } 00:12:22.791 }' 00:12:22.791 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:22.791 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:22.791 pt2' 00:12:22.791 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:22.791 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:22.791 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:23.051 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:23.051 "name": "pt1", 00:12:23.051 "aliases": [ 00:12:23.051 "00000000-0000-0000-0000-000000000001" 00:12:23.051 ], 00:12:23.051 "product_name": "passthru", 00:12:23.051 "block_size": 512, 00:12:23.051 "num_blocks": 65536, 00:12:23.051 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:12:23.051 "assigned_rate_limits": { 00:12:23.051 "rw_ios_per_sec": 0, 00:12:23.051 "rw_mbytes_per_sec": 0, 00:12:23.051 "r_mbytes_per_sec": 0, 00:12:23.051 "w_mbytes_per_sec": 0 00:12:23.051 }, 00:12:23.051 "claimed": true, 00:12:23.051 "claim_type": "exclusive_write", 00:12:23.051 "zoned": false, 00:12:23.051 "supported_io_types": { 00:12:23.051 "read": true, 00:12:23.051 "write": true, 00:12:23.051 "unmap": true, 00:12:23.051 "flush": true, 00:12:23.051 "reset": true, 00:12:23.051 "nvme_admin": false, 00:12:23.051 "nvme_io": false, 00:12:23.051 "nvme_io_md": false, 00:12:23.051 "write_zeroes": true, 00:12:23.051 "zcopy": true, 00:12:23.051 "get_zone_info": false, 00:12:23.051 "zone_management": false, 00:12:23.051 "zone_append": false, 00:12:23.051 "compare": false, 00:12:23.051 "compare_and_write": false, 00:12:23.051 "abort": true, 00:12:23.051 "seek_hole": false, 00:12:23.051 "seek_data": false, 00:12:23.051 "copy": true, 00:12:23.051 "nvme_iov_md": false 00:12:23.051 }, 00:12:23.051 "memory_domains": [ 00:12:23.051 { 00:12:23.051 "dma_device_id": "system", 00:12:23.051 "dma_device_type": 1 00:12:23.051 }, 00:12:23.051 { 00:12:23.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.051 "dma_device_type": 2 00:12:23.051 } 00:12:23.051 ], 00:12:23.051 "driver_specific": { 00:12:23.051 "passthru": { 00:12:23.051 "name": "pt1", 00:12:23.051 "base_bdev_name": "malloc1" 00:12:23.051 } 00:12:23.051 } 00:12:23.051 }' 00:12:23.051 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.051 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.051 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:23.051 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.051 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.051 13:21:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:23.051 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:23.311 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:23.311 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:23.311 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:23.311 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:23.311 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:23.311 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:23.311 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:23.311 13:21:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:23.572 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:23.572 "name": "pt2", 00:12:23.572 "aliases": [ 00:12:23.572 "00000000-0000-0000-0000-000000000002" 00:12:23.572 ], 00:12:23.572 "product_name": "passthru", 00:12:23.572 "block_size": 512, 00:12:23.572 "num_blocks": 65536, 00:12:23.572 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:23.572 "assigned_rate_limits": { 00:12:23.572 "rw_ios_per_sec": 0, 00:12:23.572 "rw_mbytes_per_sec": 0, 00:12:23.572 "r_mbytes_per_sec": 0, 00:12:23.572 "w_mbytes_per_sec": 0 00:12:23.572 }, 00:12:23.572 "claimed": true, 00:12:23.572 "claim_type": "exclusive_write", 00:12:23.572 "zoned": false, 00:12:23.572 "supported_io_types": { 00:12:23.572 "read": true, 00:12:23.572 "write": true, 00:12:23.572 "unmap": true, 00:12:23.572 "flush": true, 00:12:23.572 "reset": true, 00:12:23.572 "nvme_admin": false, 00:12:23.572 
"nvme_io": false, 00:12:23.572 "nvme_io_md": false, 00:12:23.572 "write_zeroes": true, 00:12:23.572 "zcopy": true, 00:12:23.572 "get_zone_info": false, 00:12:23.572 "zone_management": false, 00:12:23.572 "zone_append": false, 00:12:23.572 "compare": false, 00:12:23.572 "compare_and_write": false, 00:12:23.572 "abort": true, 00:12:23.572 "seek_hole": false, 00:12:23.572 "seek_data": false, 00:12:23.572 "copy": true, 00:12:23.572 "nvme_iov_md": false 00:12:23.572 }, 00:12:23.572 "memory_domains": [ 00:12:23.572 { 00:12:23.572 "dma_device_id": "system", 00:12:23.572 "dma_device_type": 1 00:12:23.572 }, 00:12:23.572 { 00:12:23.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:23.572 "dma_device_type": 2 00:12:23.572 } 00:12:23.572 ], 00:12:23.572 "driver_specific": { 00:12:23.572 "passthru": { 00:12:23.572 "name": "pt2", 00:12:23.572 "base_bdev_name": "malloc2" 00:12:23.572 } 00:12:23.572 } 00:12:23.572 }' 00:12:23.572 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.572 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:23.572 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:23.572 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.572 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:23.572 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:23.572 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:23.831 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:23.831 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:23.831 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:23.831 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:12:23.831 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:23.831 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:23.831 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:12:24.091 [2024-07-25 13:21:04.702778] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' ff4ac280-f4cc-4403-ab2f-8926a31f5949 '!=' ff4ac280-f4cc-4403-ab2f-8926a31f5949 ']' 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 882194 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 882194 ']' 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 882194 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 882194 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with 
pid 882194' 00:12:24.091 killing process with pid 882194 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 882194 00:12:24.091 [2024-07-25 13:21:04.768134] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:24.091 [2024-07-25 13:21:04.768170] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:24.091 [2024-07-25 13:21:04.768200] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:24.091 [2024-07-25 13:21:04.768206] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12a0100 name raid_bdev1, state offline 00:12:24.091 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 882194 00:12:24.091 [2024-07-25 13:21:04.777381] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:24.352 13:21:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:12:24.352 00:12:24.352 real 0m8.957s 00:12:24.352 user 0m16.387s 00:12:24.352 sys 0m1.307s 00:12:24.352 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:24.352 13:21:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.352 ************************************ 00:12:24.352 END TEST raid_superblock_test 00:12:24.352 ************************************ 00:12:24.352 13:21:04 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:12:24.352 13:21:04 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:24.352 13:21:04 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:24.352 13:21:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:24.352 ************************************ 00:12:24.352 START TEST raid_read_error_test 00:12:24.352 ************************************ 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@1125 -- # raid_io_error_test concat 2 read 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.ge5kFNXOoK 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=884055 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 884055 /var/tmp/spdk-raid.sock 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 884055 ']' 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:24.352 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:24.352 13:21:04 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:24.352 [2024-07-25 13:21:05.047240] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:12:24.352 [2024-07-25 13:21:05.047294] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid884055 ] 00:12:24.352 [2024-07-25 13:21:05.139020] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:24.613 [2024-07-25 13:21:05.206916] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:24.613 [2024-07-25 13:21:05.252540] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.613 [2024-07-25 13:21:05.252566] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:25.184 13:21:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:25.184 13:21:05 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:12:25.184 13:21:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:25.184 13:21:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:25.444 BaseBdev1_malloc 00:12:25.444 13:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:25.705 true 00:12:25.705 13:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:25.705 [2024-07-25 13:21:06.422953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:25.705 [2024-07-25 13:21:06.422982] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:12:25.705 [2024-07-25 13:21:06.422998] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26462a0 00:12:25.705 [2024-07-25 13:21:06.423005] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:25.705 [2024-07-25 13:21:06.424304] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:25.705 [2024-07-25 13:21:06.424324] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:25.705 BaseBdev1 00:12:25.705 13:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:25.705 13:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:25.965 BaseBdev2_malloc 00:12:25.965 13:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:26.225 true 00:12:26.225 13:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:26.225 [2024-07-25 13:21:06.978181] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:26.225 [2024-07-25 13:21:06.978210] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:26.225 [2024-07-25 13:21:06.978223] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2705420 00:12:26.225 [2024-07-25 13:21:06.978230] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:26.225 [2024-07-25 13:21:06.979435] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:26.225 [2024-07-25 13:21:06.979454] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:26.225 BaseBdev2 00:12:26.225 13:21:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:26.485 [2024-07-25 13:21:07.166683] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:26.485 [2024-07-25 13:21:07.167698] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:26.485 [2024-07-25 13:21:07.167828] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x27056c0 00:12:26.485 [2024-07-25 13:21:07.167835] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:26.485 [2024-07-25 13:21:07.167984] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x270a6f0 00:12:26.485 [2024-07-25 13:21:07.168095] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27056c0 00:12:26.485 [2024-07-25 13:21:07.168100] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27056c0 00:12:26.485 [2024-07-25 13:21:07.168186] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:26.485 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:26.485 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:26.485 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:26.485 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:26.485 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:26.485 13:21:07 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:26.486 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.486 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.486 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.486 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.486 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.486 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:26.746 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.746 "name": "raid_bdev1", 00:12:26.746 "uuid": "131def8f-c8ab-4ef9-80a8-e3576028c6b4", 00:12:26.746 "strip_size_kb": 64, 00:12:26.746 "state": "online", 00:12:26.746 "raid_level": "concat", 00:12:26.746 "superblock": true, 00:12:26.746 "num_base_bdevs": 2, 00:12:26.746 "num_base_bdevs_discovered": 2, 00:12:26.746 "num_base_bdevs_operational": 2, 00:12:26.746 "base_bdevs_list": [ 00:12:26.746 { 00:12:26.746 "name": "BaseBdev1", 00:12:26.746 "uuid": "2549a263-0045-5675-8b80-ddcf5dac376e", 00:12:26.746 "is_configured": true, 00:12:26.746 "data_offset": 2048, 00:12:26.746 "data_size": 63488 00:12:26.746 }, 00:12:26.746 { 00:12:26.746 "name": "BaseBdev2", 00:12:26.746 "uuid": "5b4b5df1-23ef-55bd-bf8f-25936706cac7", 00:12:26.746 "is_configured": true, 00:12:26.746 "data_offset": 2048, 00:12:26.746 "data_size": 63488 00:12:26.746 } 00:12:26.746 ] 00:12:26.746 }' 00:12:26.746 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.746 13:21:07 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:27.316 13:21:07 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:12:27.316 13:21:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:27.316 [2024-07-25 13:21:07.997029] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2645230 00:12:28.254 13:21:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:28.513 13:21:09 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.513 "name": "raid_bdev1", 00:12:28.513 "uuid": "131def8f-c8ab-4ef9-80a8-e3576028c6b4", 00:12:28.513 "strip_size_kb": 64, 00:12:28.513 "state": "online", 00:12:28.513 "raid_level": "concat", 00:12:28.513 "superblock": true, 00:12:28.513 "num_base_bdevs": 2, 00:12:28.513 "num_base_bdevs_discovered": 2, 00:12:28.513 "num_base_bdevs_operational": 2, 00:12:28.513 "base_bdevs_list": [ 00:12:28.513 { 00:12:28.513 "name": "BaseBdev1", 00:12:28.513 "uuid": "2549a263-0045-5675-8b80-ddcf5dac376e", 00:12:28.513 "is_configured": true, 00:12:28.513 "data_offset": 2048, 00:12:28.513 "data_size": 63488 00:12:28.513 }, 00:12:28.513 { 00:12:28.513 "name": "BaseBdev2", 00:12:28.513 "uuid": "5b4b5df1-23ef-55bd-bf8f-25936706cac7", 00:12:28.513 "is_configured": true, 00:12:28.513 "data_offset": 2048, 00:12:28.513 "data_size": 63488 00:12:28.513 } 00:12:28.513 ] 00:12:28.513 }' 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.513 13:21:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.081 13:21:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:29.340 [2024-07-25 13:21:10.014716] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:29.340 [2024-07-25 13:21:10.014740] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:12:29.340 [2024-07-25 13:21:10.017322] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:29.340 [2024-07-25 13:21:10.017343] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:29.340 [2024-07-25 13:21:10.017361] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:29.340 [2024-07-25 13:21:10.017368] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27056c0 name raid_bdev1, state offline 00:12:29.340 0 00:12:29.340 13:21:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 884055 00:12:29.340 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 884055 ']' 00:12:29.340 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 884055 00:12:29.340 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:12:29.340 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:29.341 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 884055 00:12:29.341 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:29.341 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:29.341 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 884055' 00:12:29.341 killing process with pid 884055 00:12:29.341 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 884055 00:12:29.341 [2024-07-25 13:21:10.083978] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:29.341 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 884055 00:12:29.341 [2024-07-25 13:21:10.090187] 
bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:29.601 13:21:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.ge5kFNXOoK 00:12:29.601 13:21:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:12:29.601 13:21:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:12:29.601 13:21:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.50 00:12:29.601 13:21:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:12:29.601 13:21:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:29.601 13:21:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:29.601 13:21:10 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.50 != \0\.\0\0 ]] 00:12:29.601 00:12:29.601 real 0m5.246s 00:12:29.601 user 0m8.221s 00:12:29.601 sys 0m0.732s 00:12:29.601 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:29.601 13:21:10 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.601 ************************************ 00:12:29.601 END TEST raid_read_error_test 00:12:29.601 ************************************ 00:12:29.601 13:21:10 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:12:29.601 13:21:10 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:29.601 13:21:10 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:29.601 13:21:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:29.601 ************************************ 00:12:29.601 START TEST raid_write_error_test 00:12:29.601 ************************************ 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 2 write 00:12:29.601 13:21:10 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:12:29.601 13:21:10 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.9ZWw4VG757 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=885285 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 885285 /var/tmp/spdk-raid.sock 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 885285 ']' 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:29.601 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:29.601 13:21:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:29.601 [2024-07-25 13:21:10.364989] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:12:29.601 [2024-07-25 13:21:10.365037] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid885285 ] 00:12:29.860 [2024-07-25 13:21:10.455301] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:29.860 [2024-07-25 13:21:10.523731] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:29.860 [2024-07-25 13:21:10.565687] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:29.860 [2024-07-25 13:21:10.565711] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:30.428 13:21:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:30.429 13:21:11 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:12:30.429 13:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:30.429 13:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:12:30.688 BaseBdev1_malloc 00:12:30.688 13:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:12:30.947 true 00:12:30.947 13:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:12:30.947 [2024-07-25 13:21:11.728330] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:12:30.947 [2024-07-25 13:21:11.728363] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:12:30.947 [2024-07-25 13:21:11.728374] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf7b2a0 00:12:30.947 [2024-07-25 13:21:11.728380] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:30.947 [2024-07-25 13:21:11.729702] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:30.947 [2024-07-25 13:21:11.729722] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:12:30.947 BaseBdev1 00:12:31.206 13:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:12:31.206 13:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:12:31.206 BaseBdev2_malloc 00:12:31.206 13:21:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:12:31.466 true 00:12:31.466 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:12:31.725 [2024-07-25 13:21:12.295658] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:12:31.725 [2024-07-25 13:21:12.295687] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:31.725 [2024-07-25 13:21:12.295699] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103a420 00:12:31.725 [2024-07-25 13:21:12.295706] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:31.725 [2024-07-25 13:21:12.296892] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:31.725 [2024-07-25 13:21:12.296911] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:12:31.725 BaseBdev2 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:12:31.725 [2024-07-25 13:21:12.472125] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:31.725 [2024-07-25 13:21:12.473129] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:31.725 [2024-07-25 13:21:12.473255] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x103a6c0 00:12:31.725 [2024-07-25 13:21:12.473262] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:12:31.725 [2024-07-25 13:21:12.473409] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x103f6f0 00:12:31.725 [2024-07-25 13:21:12.473520] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x103a6c0 00:12:31.725 [2024-07-25 13:21:12.473525] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x103a6c0 00:12:31.725 [2024-07-25 13:21:12.473616] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.725 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:31.984 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:31.984 "name": "raid_bdev1", 00:12:31.984 "uuid": "f9eb9f5d-ad3b-4a04-9250-496cfc5cf996", 00:12:31.984 "strip_size_kb": 64, 00:12:31.984 "state": "online", 00:12:31.984 "raid_level": "concat", 00:12:31.984 "superblock": true, 00:12:31.984 "num_base_bdevs": 2, 00:12:31.984 "num_base_bdevs_discovered": 2, 00:12:31.984 "num_base_bdevs_operational": 2, 00:12:31.985 "base_bdevs_list": [ 00:12:31.985 { 00:12:31.985 "name": "BaseBdev1", 00:12:31.985 "uuid": "d711a88f-7795-5b85-ac73-a49053524cf7", 00:12:31.985 "is_configured": true, 00:12:31.985 "data_offset": 2048, 00:12:31.985 "data_size": 63488 00:12:31.985 }, 00:12:31.985 { 00:12:31.985 "name": "BaseBdev2", 00:12:31.985 "uuid": "58a74b76-26d7-5a1f-a65c-0a909f2a373c", 00:12:31.985 "is_configured": true, 00:12:31.985 "data_offset": 2048, 00:12:31.985 "data_size": 63488 00:12:31.985 } 00:12:31.985 ] 00:12:31.985 }' 00:12:31.985 13:21:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:31.985 13:21:12 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:32.555 
13:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:12:32.555 13:21:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:12:32.555 [2024-07-25 13:21:13.338670] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf7a230 00:12:33.562 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:33.821 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.080 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:34.080 "name": "raid_bdev1", 00:12:34.080 "uuid": "f9eb9f5d-ad3b-4a04-9250-496cfc5cf996", 00:12:34.080 "strip_size_kb": 64, 00:12:34.080 "state": "online", 00:12:34.080 "raid_level": "concat", 00:12:34.080 "superblock": true, 00:12:34.080 "num_base_bdevs": 2, 00:12:34.080 "num_base_bdevs_discovered": 2, 00:12:34.080 "num_base_bdevs_operational": 2, 00:12:34.081 "base_bdevs_list": [ 00:12:34.081 { 00:12:34.081 "name": "BaseBdev1", 00:12:34.081 "uuid": "d711a88f-7795-5b85-ac73-a49053524cf7", 00:12:34.081 "is_configured": true, 00:12:34.081 "data_offset": 2048, 00:12:34.081 "data_size": 63488 00:12:34.081 }, 00:12:34.081 { 00:12:34.081 "name": "BaseBdev2", 00:12:34.081 "uuid": "58a74b76-26d7-5a1f-a65c-0a909f2a373c", 00:12:34.081 "is_configured": true, 00:12:34.081 "data_offset": 2048, 00:12:34.081 "data_size": 63488 00:12:34.081 } 00:12:34.081 ] 00:12:34.081 }' 00:12:34.081 13:21:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:34.081 13:21:14 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.646 13:21:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:34.905 [2024-07-25 13:21:15.441391] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:34.905 [2024-07-25 13:21:15.441418] bdev_raid.c:1886:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:12:34.905 [2024-07-25 13:21:15.443999] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:34.905 [2024-07-25 13:21:15.444020] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:34.905 [2024-07-25 13:21:15.444038] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:34.905 [2024-07-25 13:21:15.444045] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x103a6c0 name raid_bdev1, state offline 00:12:34.905 0 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 885285 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 885285 ']' 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 885285 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 885285 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 885285' 00:12:34.905 killing process with pid 885285 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 885285 00:12:34.905 [2024-07-25 13:21:15.510344] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 885285 00:12:34.905 
[2024-07-25 13:21:15.515817] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.9ZWw4VG757 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.48 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.48 != \0\.\0\0 ]] 00:12:34.905 00:12:34.905 real 0m5.349s 00:12:34.905 user 0m8.434s 00:12:34.905 sys 0m0.735s 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:34.905 13:21:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:12:34.905 ************************************ 00:12:34.905 END TEST raid_write_error_test 00:12:34.905 ************************************ 00:12:34.905 13:21:15 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:12:34.905 13:21:15 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:12:34.905 13:21:15 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:34.905 13:21:15 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:34.905 13:21:15 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:35.164 ************************************ 00:12:35.164 START TEST raid_state_function_test 00:12:35.164 ************************************ 00:12:35.164 
13:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 false 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:35.164 13:21:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=886532 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 886532' 00:12:35.164 Process raid pid: 886532 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 886532 /var/tmp/spdk-raid.sock 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 886532 ']' 00:12:35.164 13:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:35.165 13:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:35.165 13:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:35.165 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:35.165 13:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:35.165 13:21:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:35.165 [2024-07-25 13:21:15.777141] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:12:35.165 [2024-07-25 13:21:15.777205] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:35.165 [2024-07-25 13:21:15.876156] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:35.165 [2024-07-25 13:21:15.939783] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:35.423 [2024-07-25 13:21:15.983147] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:35.423 [2024-07-25 13:21:15.983171] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:35.992 13:21:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:35.992 13:21:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:12:35.992 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:35.992 [2024-07-25 13:21:16.774511] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:35.992 [2024-07-25 13:21:16.774539] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:35.992 [2024-07-25 13:21:16.774550] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:35.992 [2024-07-25 13:21:16.774556] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:36.251 "name": "Existed_Raid", 00:12:36.251 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.251 "strip_size_kb": 0, 00:12:36.251 "state": "configuring", 00:12:36.251 "raid_level": "raid1", 00:12:36.251 "superblock": false, 00:12:36.251 "num_base_bdevs": 2, 00:12:36.251 "num_base_bdevs_discovered": 0, 00:12:36.251 "num_base_bdevs_operational": 2, 
00:12:36.251 "base_bdevs_list": [ 00:12:36.251 { 00:12:36.251 "name": "BaseBdev1", 00:12:36.251 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.251 "is_configured": false, 00:12:36.251 "data_offset": 0, 00:12:36.251 "data_size": 0 00:12:36.251 }, 00:12:36.251 { 00:12:36.251 "name": "BaseBdev2", 00:12:36.251 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:36.251 "is_configured": false, 00:12:36.251 "data_offset": 0, 00:12:36.251 "data_size": 0 00:12:36.251 } 00:12:36.251 ] 00:12:36.251 }' 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:36.251 13:21:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:36.819 13:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:37.078 [2024-07-25 13:21:17.708790] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:37.078 [2024-07-25 13:21:17.708810] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x262d6b0 name Existed_Raid, state configuring 00:12:37.078 13:21:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:37.338 [2024-07-25 13:21:17.905302] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:37.338 [2024-07-25 13:21:17.905321] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:37.338 [2024-07-25 13:21:17.905327] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:37.338 [2024-07-25 13:21:17.905333] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:37.338 13:21:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:37.338 [2024-07-25 13:21:18.096267] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:37.338 BaseBdev1 00:12:37.338 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:37.338 13:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:37.338 13:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:37.338 13:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:37.338 13:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:37.338 13:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:37.338 13:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:37.597 13:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:37.857 [ 00:12:37.857 { 00:12:37.857 "name": "BaseBdev1", 00:12:37.857 "aliases": [ 00:12:37.857 "5e34a745-88a5-437d-8162-1146c880a037" 00:12:37.857 ], 00:12:37.857 "product_name": "Malloc disk", 00:12:37.857 "block_size": 512, 00:12:37.857 "num_blocks": 65536, 00:12:37.857 "uuid": "5e34a745-88a5-437d-8162-1146c880a037", 00:12:37.857 "assigned_rate_limits": { 00:12:37.857 "rw_ios_per_sec": 0, 00:12:37.857 "rw_mbytes_per_sec": 0, 00:12:37.857 "r_mbytes_per_sec": 0, 00:12:37.857 "w_mbytes_per_sec": 0 00:12:37.857 }, 00:12:37.857 "claimed": true, 
00:12:37.857 "claim_type": "exclusive_write", 00:12:37.857 "zoned": false, 00:12:37.857 "supported_io_types": { 00:12:37.857 "read": true, 00:12:37.857 "write": true, 00:12:37.857 "unmap": true, 00:12:37.857 "flush": true, 00:12:37.857 "reset": true, 00:12:37.857 "nvme_admin": false, 00:12:37.857 "nvme_io": false, 00:12:37.857 "nvme_io_md": false, 00:12:37.857 "write_zeroes": true, 00:12:37.857 "zcopy": true, 00:12:37.857 "get_zone_info": false, 00:12:37.857 "zone_management": false, 00:12:37.857 "zone_append": false, 00:12:37.857 "compare": false, 00:12:37.857 "compare_and_write": false, 00:12:37.857 "abort": true, 00:12:37.857 "seek_hole": false, 00:12:37.857 "seek_data": false, 00:12:37.857 "copy": true, 00:12:37.857 "nvme_iov_md": false 00:12:37.857 }, 00:12:37.857 "memory_domains": [ 00:12:37.857 { 00:12:37.857 "dma_device_id": "system", 00:12:37.857 "dma_device_type": 1 00:12:37.857 }, 00:12:37.857 { 00:12:37.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.857 "dma_device_type": 2 00:12:37.857 } 00:12:37.857 ], 00:12:37.857 "driver_specific": {} 00:12:37.857 } 00:12:37.857 ] 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:37.857 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:38.116 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.116 "name": "Existed_Raid", 00:12:38.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.116 "strip_size_kb": 0, 00:12:38.116 "state": "configuring", 00:12:38.116 "raid_level": "raid1", 00:12:38.116 "superblock": false, 00:12:38.116 "num_base_bdevs": 2, 00:12:38.116 "num_base_bdevs_discovered": 1, 00:12:38.116 "num_base_bdevs_operational": 2, 00:12:38.116 "base_bdevs_list": [ 00:12:38.116 { 00:12:38.116 "name": "BaseBdev1", 00:12:38.116 "uuid": "5e34a745-88a5-437d-8162-1146c880a037", 00:12:38.116 "is_configured": true, 00:12:38.116 "data_offset": 0, 00:12:38.116 "data_size": 65536 00:12:38.116 }, 00:12:38.116 { 00:12:38.116 "name": "BaseBdev2", 00:12:38.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.116 "is_configured": false, 00:12:38.116 "data_offset": 0, 00:12:38.116 "data_size": 0 00:12:38.116 } 00:12:38.116 ] 00:12:38.116 }' 00:12:38.116 13:21:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.116 13:21:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:38.685 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:38.685 [2024-07-25 13:21:19.399558] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:38.685 [2024-07-25 13:21:19.399584] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x262cfa0 name Existed_Raid, state configuring 00:12:38.685 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:38.945 [2024-07-25 13:21:19.596069] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:38.945 [2024-07-25 13:21:19.597199] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:38.945 [2024-07-25 13:21:19.597225] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:38.945 13:21:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.945 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.204 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.204 "name": "Existed_Raid", 00:12:39.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.204 "strip_size_kb": 0, 00:12:39.204 "state": "configuring", 00:12:39.204 "raid_level": "raid1", 00:12:39.204 "superblock": false, 00:12:39.204 "num_base_bdevs": 2, 00:12:39.204 "num_base_bdevs_discovered": 1, 00:12:39.204 "num_base_bdevs_operational": 2, 00:12:39.204 "base_bdevs_list": [ 00:12:39.204 { 00:12:39.204 "name": "BaseBdev1", 00:12:39.204 "uuid": "5e34a745-88a5-437d-8162-1146c880a037", 00:12:39.204 "is_configured": true, 00:12:39.204 "data_offset": 0, 00:12:39.204 "data_size": 65536 00:12:39.204 }, 00:12:39.204 { 00:12:39.204 "name": "BaseBdev2", 00:12:39.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.204 "is_configured": false, 00:12:39.204 "data_offset": 0, 00:12:39.204 "data_size": 0 00:12:39.204 } 00:12:39.204 ] 00:12:39.204 }' 00:12:39.204 13:21:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.204 13:21:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:39.774 13:21:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:39.774 [2024-07-25 13:21:20.543351] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:39.774 [2024-07-25 13:21:20.543378] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x262dda0 00:12:39.774 [2024-07-25 13:21:20.543382] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:12:39.774 [2024-07-25 13:21:20.543527] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27d1990 00:12:39.774 [2024-07-25 13:21:20.543628] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x262dda0 00:12:39.774 [2024-07-25 13:21:20.543634] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x262dda0 00:12:39.774 [2024-07-25 13:21:20.543756] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:39.774 BaseBdev2 00:12:39.774 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:39.774 13:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:39.774 13:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:39.774 13:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:12:39.774 13:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:39.774 13:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:39.774 13:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:40.033 13:21:20 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:40.293 [ 00:12:40.293 { 00:12:40.293 "name": "BaseBdev2", 00:12:40.293 "aliases": [ 00:12:40.293 "83e68e6d-15e3-4c16-b0c8-9b720e86b10d" 00:12:40.293 ], 00:12:40.293 "product_name": "Malloc disk", 00:12:40.293 "block_size": 512, 00:12:40.293 "num_blocks": 65536, 00:12:40.293 "uuid": "83e68e6d-15e3-4c16-b0c8-9b720e86b10d", 00:12:40.293 "assigned_rate_limits": { 00:12:40.293 "rw_ios_per_sec": 0, 00:12:40.293 "rw_mbytes_per_sec": 0, 00:12:40.293 "r_mbytes_per_sec": 0, 00:12:40.293 "w_mbytes_per_sec": 0 00:12:40.293 }, 00:12:40.293 "claimed": true, 00:12:40.293 "claim_type": "exclusive_write", 00:12:40.293 "zoned": false, 00:12:40.293 "supported_io_types": { 00:12:40.293 "read": true, 00:12:40.293 "write": true, 00:12:40.293 "unmap": true, 00:12:40.293 "flush": true, 00:12:40.293 "reset": true, 00:12:40.293 "nvme_admin": false, 00:12:40.293 "nvme_io": false, 00:12:40.293 "nvme_io_md": false, 00:12:40.293 "write_zeroes": true, 00:12:40.293 "zcopy": true, 00:12:40.293 "get_zone_info": false, 00:12:40.293 "zone_management": false, 00:12:40.293 "zone_append": false, 00:12:40.293 "compare": false, 00:12:40.293 "compare_and_write": false, 00:12:40.293 "abort": true, 00:12:40.293 "seek_hole": false, 00:12:40.293 "seek_data": false, 00:12:40.293 "copy": true, 00:12:40.293 "nvme_iov_md": false 00:12:40.293 }, 00:12:40.293 "memory_domains": [ 00:12:40.293 { 00:12:40.293 "dma_device_id": "system", 00:12:40.293 "dma_device_type": 1 00:12:40.293 }, 00:12:40.293 { 00:12:40.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:40.293 "dma_device_type": 2 00:12:40.293 } 00:12:40.293 ], 00:12:40.293 "driver_specific": {} 00:12:40.293 } 00:12:40.293 ] 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.293 13:21:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:40.552 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:40.552 "name": "Existed_Raid", 00:12:40.552 "uuid": "fdef24ef-a8ad-4aad-bd5d-86160c2d1f80", 00:12:40.552 "strip_size_kb": 0, 00:12:40.552 "state": "online", 00:12:40.552 "raid_level": "raid1", 00:12:40.552 "superblock": false, 00:12:40.552 "num_base_bdevs": 
2, 00:12:40.552 "num_base_bdevs_discovered": 2, 00:12:40.552 "num_base_bdevs_operational": 2, 00:12:40.552 "base_bdevs_list": [ 00:12:40.552 { 00:12:40.552 "name": "BaseBdev1", 00:12:40.552 "uuid": "5e34a745-88a5-437d-8162-1146c880a037", 00:12:40.552 "is_configured": true, 00:12:40.552 "data_offset": 0, 00:12:40.552 "data_size": 65536 00:12:40.552 }, 00:12:40.552 { 00:12:40.552 "name": "BaseBdev2", 00:12:40.552 "uuid": "83e68e6d-15e3-4c16-b0c8-9b720e86b10d", 00:12:40.552 "is_configured": true, 00:12:40.552 "data_offset": 0, 00:12:40.552 "data_size": 65536 00:12:40.552 } 00:12:40.552 ] 00:12:40.552 }' 00:12:40.552 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:40.552 13:21:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:41.121 [2024-07-25 13:21:21.810766] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # 
raid_bdev_info='{ 00:12:41.121 "name": "Existed_Raid", 00:12:41.121 "aliases": [ 00:12:41.121 "fdef24ef-a8ad-4aad-bd5d-86160c2d1f80" 00:12:41.121 ], 00:12:41.121 "product_name": "Raid Volume", 00:12:41.121 "block_size": 512, 00:12:41.121 "num_blocks": 65536, 00:12:41.121 "uuid": "fdef24ef-a8ad-4aad-bd5d-86160c2d1f80", 00:12:41.121 "assigned_rate_limits": { 00:12:41.121 "rw_ios_per_sec": 0, 00:12:41.121 "rw_mbytes_per_sec": 0, 00:12:41.121 "r_mbytes_per_sec": 0, 00:12:41.121 "w_mbytes_per_sec": 0 00:12:41.121 }, 00:12:41.121 "claimed": false, 00:12:41.121 "zoned": false, 00:12:41.121 "supported_io_types": { 00:12:41.121 "read": true, 00:12:41.121 "write": true, 00:12:41.121 "unmap": false, 00:12:41.121 "flush": false, 00:12:41.121 "reset": true, 00:12:41.121 "nvme_admin": false, 00:12:41.121 "nvme_io": false, 00:12:41.121 "nvme_io_md": false, 00:12:41.121 "write_zeroes": true, 00:12:41.121 "zcopy": false, 00:12:41.121 "get_zone_info": false, 00:12:41.121 "zone_management": false, 00:12:41.121 "zone_append": false, 00:12:41.121 "compare": false, 00:12:41.121 "compare_and_write": false, 00:12:41.121 "abort": false, 00:12:41.121 "seek_hole": false, 00:12:41.121 "seek_data": false, 00:12:41.121 "copy": false, 00:12:41.121 "nvme_iov_md": false 00:12:41.121 }, 00:12:41.121 "memory_domains": [ 00:12:41.121 { 00:12:41.121 "dma_device_id": "system", 00:12:41.121 "dma_device_type": 1 00:12:41.121 }, 00:12:41.121 { 00:12:41.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.121 "dma_device_type": 2 00:12:41.121 }, 00:12:41.121 { 00:12:41.121 "dma_device_id": "system", 00:12:41.121 "dma_device_type": 1 00:12:41.121 }, 00:12:41.121 { 00:12:41.121 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.121 "dma_device_type": 2 00:12:41.121 } 00:12:41.121 ], 00:12:41.121 "driver_specific": { 00:12:41.121 "raid": { 00:12:41.121 "uuid": "fdef24ef-a8ad-4aad-bd5d-86160c2d1f80", 00:12:41.121 "strip_size_kb": 0, 00:12:41.121 "state": "online", 00:12:41.121 "raid_level": "raid1", 
00:12:41.121 "superblock": false, 00:12:41.121 "num_base_bdevs": 2, 00:12:41.121 "num_base_bdevs_discovered": 2, 00:12:41.121 "num_base_bdevs_operational": 2, 00:12:41.121 "base_bdevs_list": [ 00:12:41.121 { 00:12:41.121 "name": "BaseBdev1", 00:12:41.121 "uuid": "5e34a745-88a5-437d-8162-1146c880a037", 00:12:41.121 "is_configured": true, 00:12:41.121 "data_offset": 0, 00:12:41.121 "data_size": 65536 00:12:41.121 }, 00:12:41.121 { 00:12:41.121 "name": "BaseBdev2", 00:12:41.121 "uuid": "83e68e6d-15e3-4c16-b0c8-9b720e86b10d", 00:12:41.121 "is_configured": true, 00:12:41.121 "data_offset": 0, 00:12:41.121 "data_size": 65536 00:12:41.121 } 00:12:41.121 ] 00:12:41.121 } 00:12:41.121 } 00:12:41.121 }' 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:41.121 BaseBdev2' 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:41.121 13:21:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:41.381 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:41.381 "name": "BaseBdev1", 00:12:41.381 "aliases": [ 00:12:41.381 "5e34a745-88a5-437d-8162-1146c880a037" 00:12:41.381 ], 00:12:41.381 "product_name": "Malloc disk", 00:12:41.381 "block_size": 512, 00:12:41.381 "num_blocks": 65536, 00:12:41.381 "uuid": "5e34a745-88a5-437d-8162-1146c880a037", 00:12:41.381 "assigned_rate_limits": { 00:12:41.381 "rw_ios_per_sec": 0, 00:12:41.381 "rw_mbytes_per_sec": 0, 00:12:41.381 "r_mbytes_per_sec": 0, 00:12:41.381 
"w_mbytes_per_sec": 0 00:12:41.381 }, 00:12:41.381 "claimed": true, 00:12:41.381 "claim_type": "exclusive_write", 00:12:41.381 "zoned": false, 00:12:41.381 "supported_io_types": { 00:12:41.381 "read": true, 00:12:41.381 "write": true, 00:12:41.381 "unmap": true, 00:12:41.381 "flush": true, 00:12:41.381 "reset": true, 00:12:41.381 "nvme_admin": false, 00:12:41.381 "nvme_io": false, 00:12:41.381 "nvme_io_md": false, 00:12:41.381 "write_zeroes": true, 00:12:41.381 "zcopy": true, 00:12:41.381 "get_zone_info": false, 00:12:41.381 "zone_management": false, 00:12:41.381 "zone_append": false, 00:12:41.381 "compare": false, 00:12:41.381 "compare_and_write": false, 00:12:41.381 "abort": true, 00:12:41.381 "seek_hole": false, 00:12:41.381 "seek_data": false, 00:12:41.381 "copy": true, 00:12:41.381 "nvme_iov_md": false 00:12:41.381 }, 00:12:41.381 "memory_domains": [ 00:12:41.381 { 00:12:41.381 "dma_device_id": "system", 00:12:41.381 "dma_device_type": 1 00:12:41.381 }, 00:12:41.381 { 00:12:41.381 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.381 "dma_device_type": 2 00:12:41.381 } 00:12:41.381 ], 00:12:41.381 "driver_specific": {} 00:12:41.381 }' 00:12:41.381 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.381 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:41.381 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:41.381 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.641 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:41.641 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:41.641 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.641 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:41.641 
13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:41.641 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:41.901 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:41.901 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:41.901 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:41.901 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:41.901 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:42.161 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:42.161 "name": "BaseBdev2", 00:12:42.161 "aliases": [ 00:12:42.161 "83e68e6d-15e3-4c16-b0c8-9b720e86b10d" 00:12:42.161 ], 00:12:42.161 "product_name": "Malloc disk", 00:12:42.161 "block_size": 512, 00:12:42.161 "num_blocks": 65536, 00:12:42.161 "uuid": "83e68e6d-15e3-4c16-b0c8-9b720e86b10d", 00:12:42.161 "assigned_rate_limits": { 00:12:42.161 "rw_ios_per_sec": 0, 00:12:42.161 "rw_mbytes_per_sec": 0, 00:12:42.161 "r_mbytes_per_sec": 0, 00:12:42.161 "w_mbytes_per_sec": 0 00:12:42.161 }, 00:12:42.161 "claimed": true, 00:12:42.161 "claim_type": "exclusive_write", 00:12:42.161 "zoned": false, 00:12:42.161 "supported_io_types": { 00:12:42.161 "read": true, 00:12:42.161 "write": true, 00:12:42.161 "unmap": true, 00:12:42.161 "flush": true, 00:12:42.161 "reset": true, 00:12:42.161 "nvme_admin": false, 00:12:42.161 "nvme_io": false, 00:12:42.161 "nvme_io_md": false, 00:12:42.161 "write_zeroes": true, 00:12:42.161 "zcopy": true, 00:12:42.161 "get_zone_info": false, 00:12:42.161 "zone_management": false, 00:12:42.161 "zone_append": false, 00:12:42.161 "compare": 
false, 00:12:42.161 "compare_and_write": false, 00:12:42.161 "abort": true, 00:12:42.161 "seek_hole": false, 00:12:42.161 "seek_data": false, 00:12:42.161 "copy": true, 00:12:42.161 "nvme_iov_md": false 00:12:42.161 }, 00:12:42.161 "memory_domains": [ 00:12:42.161 { 00:12:42.161 "dma_device_id": "system", 00:12:42.161 "dma_device_type": 1 00:12:42.161 }, 00:12:42.161 { 00:12:42.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:42.161 "dma_device_type": 2 00:12:42.161 } 00:12:42.161 ], 00:12:42.161 "driver_specific": {} 00:12:42.161 }' 00:12:42.161 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.161 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:42.161 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:42.161 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.161 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:42.161 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:42.161 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.421 13:21:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:42.421 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:42.421 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.421 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:42.421 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:42.421 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:42.682 
[2024-07-25 13:21:23.386580] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.682 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.682 13:21:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:42.941 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:42.941 "name": "Existed_Raid", 00:12:42.941 "uuid": "fdef24ef-a8ad-4aad-bd5d-86160c2d1f80", 00:12:42.941 "strip_size_kb": 0, 00:12:42.941 "state": "online", 00:12:42.941 "raid_level": "raid1", 00:12:42.941 "superblock": false, 00:12:42.941 "num_base_bdevs": 2, 00:12:42.941 "num_base_bdevs_discovered": 1, 00:12:42.941 "num_base_bdevs_operational": 1, 00:12:42.941 "base_bdevs_list": [ 00:12:42.941 { 00:12:42.941 "name": null, 00:12:42.941 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:42.941 "is_configured": false, 00:12:42.941 "data_offset": 0, 00:12:42.941 "data_size": 65536 00:12:42.941 }, 00:12:42.941 { 00:12:42.941 "name": "BaseBdev2", 00:12:42.941 "uuid": "83e68e6d-15e3-4c16-b0c8-9b720e86b10d", 00:12:42.941 "is_configured": true, 00:12:42.941 "data_offset": 0, 00:12:42.941 "data_size": 65536 00:12:42.941 } 00:12:42.941 ] 00:12:42.941 }' 00:12:42.941 13:21:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:42.941 13:21:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:43.510 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:43.510 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:43.510 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:43.510 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:43.771 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:43.771 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- 
# '[' Existed_Raid '!=' Existed_Raid ']' 00:12:43.771 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:44.031 [2024-07-25 13:21:24.613683] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:44.031 [2024-07-25 13:21:24.613738] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:44.031 [2024-07-25 13:21:24.619755] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:44.031 [2024-07-25 13:21:24.619778] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:44.031 [2024-07-25 13:21:24.619785] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x262dda0 name Existed_Raid, state offline 00:12:44.031 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:44.031 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:44.031 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.031 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 886532 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 886532 ']' 00:12:44.293 13:21:24 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 886532 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 886532 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 886532' 00:12:44.293 killing process with pid 886532 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 886532 00:12:44.293 [2024-07-25 13:21:24.949875] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:44.293 13:21:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 886532 00:12:44.293 [2024-07-25 13:21:24.950474] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:44.293 13:21:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:12:44.293 00:12:44.293 real 0m9.347s 00:12:44.293 user 0m17.064s 00:12:44.293 sys 0m1.415s 00:12:44.293 13:21:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:44.293 13:21:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:44.293 ************************************ 00:12:44.293 END TEST raid_state_function_test 00:12:44.293 ************************************ 00:12:44.555 13:21:25 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:12:44.555 13:21:25 bdev_raid -- 
common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:12:44.555 13:21:25 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:44.555 13:21:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:44.555 ************************************ 00:12:44.555 START TEST raid_state_function_test_sb 00:12:44.555 ************************************ 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=888290 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 888290' 00:12:44.555 Process raid pid: 888290 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 888290 /var/tmp/spdk-raid.sock 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 888290 ']' 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb 
-- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:44.555 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:44.555 13:21:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:44.555 [2024-07-25 13:21:25.200755] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:12:44.555 [2024-07-25 13:21:25.200809] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:44.555 [2024-07-25 13:21:25.289899] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:44.815 [2024-07-25 13:21:25.356064] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:44.815 [2024-07-25 13:21:25.396564] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:44.815 [2024-07-25 13:21:25.396586] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:45.385 13:21:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:45.385 13:21:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:12:45.385 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:45.645 [2024-07-25 13:21:26.283759] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:45.645 [2024-07-25 13:21:26.283790] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev1 doesn't exist now 00:12:45.645 [2024-07-25 13:21:26.283796] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:45.645 [2024-07-25 13:21:26.283802] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.645 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:46.217 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:46.217 "name": "Existed_Raid", 00:12:46.217 "uuid": 
"37fbc89b-599d-4d9b-a7d7-48f05cd1f646", 00:12:46.217 "strip_size_kb": 0, 00:12:46.217 "state": "configuring", 00:12:46.217 "raid_level": "raid1", 00:12:46.217 "superblock": true, 00:12:46.217 "num_base_bdevs": 2, 00:12:46.217 "num_base_bdevs_discovered": 0, 00:12:46.217 "num_base_bdevs_operational": 2, 00:12:46.217 "base_bdevs_list": [ 00:12:46.217 { 00:12:46.217 "name": "BaseBdev1", 00:12:46.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:46.217 "is_configured": false, 00:12:46.217 "data_offset": 0, 00:12:46.217 "data_size": 0 00:12:46.217 }, 00:12:46.217 { 00:12:46.217 "name": "BaseBdev2", 00:12:46.217 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:46.217 "is_configured": false, 00:12:46.217 "data_offset": 0, 00:12:46.217 "data_size": 0 00:12:46.217 } 00:12:46.217 ] 00:12:46.217 }' 00:12:46.217 13:21:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:46.217 13:21:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:46.787 13:21:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:47.047 [2024-07-25 13:21:27.687168] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:47.047 [2024-07-25 13:21:27.687188] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fde6b0 name Existed_Raid, state configuring 00:12:47.047 13:21:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:47.307 [2024-07-25 13:21:27.887692] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:47.307 [2024-07-25 13:21:27.887708] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 
doesn't exist now 00:12:47.307 [2024-07-25 13:21:27.887713] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:47.307 [2024-07-25 13:21:27.887718] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:47.307 13:21:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:47.307 [2024-07-25 13:21:28.086895] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:47.307 BaseBdev1 00:12:47.567 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:47.567 13:21:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:12:47.567 13:21:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:47.567 13:21:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:47.567 13:21:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:47.567 13:21:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:47.567 13:21:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:47.567 13:21:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:47.833 [ 00:12:47.834 { 00:12:47.834 "name": "BaseBdev1", 00:12:47.834 "aliases": [ 00:12:47.834 "a0e5fff5-a6c3-4331-9d8e-3d141b9737cd" 00:12:47.834 ], 00:12:47.834 "product_name": "Malloc disk", 00:12:47.834 "block_size": 
512, 00:12:47.834 "num_blocks": 65536, 00:12:47.834 "uuid": "a0e5fff5-a6c3-4331-9d8e-3d141b9737cd", 00:12:47.834 "assigned_rate_limits": { 00:12:47.834 "rw_ios_per_sec": 0, 00:12:47.834 "rw_mbytes_per_sec": 0, 00:12:47.834 "r_mbytes_per_sec": 0, 00:12:47.834 "w_mbytes_per_sec": 0 00:12:47.834 }, 00:12:47.834 "claimed": true, 00:12:47.834 "claim_type": "exclusive_write", 00:12:47.834 "zoned": false, 00:12:47.834 "supported_io_types": { 00:12:47.834 "read": true, 00:12:47.834 "write": true, 00:12:47.834 "unmap": true, 00:12:47.834 "flush": true, 00:12:47.834 "reset": true, 00:12:47.834 "nvme_admin": false, 00:12:47.834 "nvme_io": false, 00:12:47.834 "nvme_io_md": false, 00:12:47.834 "write_zeroes": true, 00:12:47.834 "zcopy": true, 00:12:47.834 "get_zone_info": false, 00:12:47.834 "zone_management": false, 00:12:47.834 "zone_append": false, 00:12:47.834 "compare": false, 00:12:47.834 "compare_and_write": false, 00:12:47.834 "abort": true, 00:12:47.834 "seek_hole": false, 00:12:47.834 "seek_data": false, 00:12:47.834 "copy": true, 00:12:47.834 "nvme_iov_md": false 00:12:47.834 }, 00:12:47.834 "memory_domains": [ 00:12:47.834 { 00:12:47.834 "dma_device_id": "system", 00:12:47.834 "dma_device_type": 1 00:12:47.834 }, 00:12:47.834 { 00:12:47.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:47.834 "dma_device_type": 2 00:12:47.834 } 00:12:47.834 ], 00:12:47.834 "driver_specific": {} 00:12:47.834 } 00:12:47.834 ] 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.834 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:48.097 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:48.097 "name": "Existed_Raid", 00:12:48.097 "uuid": "1e8be5fe-926f-45ef-a1a0-2f038907d8d5", 00:12:48.097 "strip_size_kb": 0, 00:12:48.097 "state": "configuring", 00:12:48.097 "raid_level": "raid1", 00:12:48.097 "superblock": true, 00:12:48.097 "num_base_bdevs": 2, 00:12:48.097 "num_base_bdevs_discovered": 1, 00:12:48.097 "num_base_bdevs_operational": 2, 00:12:48.098 "base_bdevs_list": [ 00:12:48.098 { 00:12:48.098 "name": "BaseBdev1", 00:12:48.098 "uuid": "a0e5fff5-a6c3-4331-9d8e-3d141b9737cd", 00:12:48.098 "is_configured": true, 00:12:48.098 "data_offset": 2048, 00:12:48.098 "data_size": 63488 00:12:48.098 }, 00:12:48.098 { 00:12:48.098 "name": "BaseBdev2", 00:12:48.098 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:48.098 "is_configured": false, 00:12:48.098 "data_offset": 0, 00:12:48.098 
"data_size": 0 00:12:48.098 } 00:12:48.098 ] 00:12:48.098 }' 00:12:48.098 13:21:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:48.098 13:21:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:48.668 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:48.668 [2024-07-25 13:21:29.386169] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:48.668 [2024-07-25 13:21:29.386195] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fddfa0 name Existed_Raid, state configuring 00:12:48.668 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:12:48.928 [2024-07-25 13:21:29.582693] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:48.928 [2024-07-25 13:21:29.583825] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:48.928 [2024-07-25 13:21:29.583850] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.928 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:49.188 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:49.188 "name": "Existed_Raid", 00:12:49.188 "uuid": "705c921e-374e-49b7-80dc-6af6cbcdd805", 00:12:49.188 "strip_size_kb": 0, 00:12:49.188 "state": "configuring", 00:12:49.188 "raid_level": "raid1", 00:12:49.188 "superblock": true, 00:12:49.188 "num_base_bdevs": 2, 00:12:49.188 "num_base_bdevs_discovered": 1, 00:12:49.188 "num_base_bdevs_operational": 2, 00:12:49.188 "base_bdevs_list": [ 00:12:49.188 { 00:12:49.188 "name": "BaseBdev1", 00:12:49.188 "uuid": "a0e5fff5-a6c3-4331-9d8e-3d141b9737cd", 00:12:49.188 "is_configured": true, 00:12:49.188 "data_offset": 2048, 00:12:49.188 "data_size": 63488 00:12:49.188 }, 00:12:49.188 { 00:12:49.188 "name": "BaseBdev2", 00:12:49.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:49.188 
"is_configured": false, 00:12:49.188 "data_offset": 0, 00:12:49.188 "data_size": 0 00:12:49.188 } 00:12:49.188 ] 00:12:49.188 }' 00:12:49.188 13:21:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:49.188 13:21:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:50.128 13:21:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:50.128 [2024-07-25 13:21:30.874723] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:50.128 [2024-07-25 13:21:30.874836] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fdeda0 00:12:50.128 [2024-07-25 13:21:30.874844] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:50.128 [2024-07-25 13:21:30.874981] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fddb00 00:12:50.128 [2024-07-25 13:21:30.875074] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fdeda0 00:12:50.128 [2024-07-25 13:21:30.875080] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fdeda0 00:12:50.128 [2024-07-25 13:21:30.875148] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:50.128 BaseBdev2 00:12:50.128 13:21:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:50.128 13:21:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:12:50.128 13:21:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:12:50.128 13:21:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:12:50.128 13:21:30 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:12:50.128 13:21:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:12:50.128 13:21:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:50.388 13:21:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:50.647 [ 00:12:50.647 { 00:12:50.647 "name": "BaseBdev2", 00:12:50.647 "aliases": [ 00:12:50.647 "515b1d08-f8a0-4110-b356-6f25f8366e4e" 00:12:50.647 ], 00:12:50.647 "product_name": "Malloc disk", 00:12:50.647 "block_size": 512, 00:12:50.647 "num_blocks": 65536, 00:12:50.647 "uuid": "515b1d08-f8a0-4110-b356-6f25f8366e4e", 00:12:50.647 "assigned_rate_limits": { 00:12:50.647 "rw_ios_per_sec": 0, 00:12:50.647 "rw_mbytes_per_sec": 0, 00:12:50.647 "r_mbytes_per_sec": 0, 00:12:50.647 "w_mbytes_per_sec": 0 00:12:50.647 }, 00:12:50.647 "claimed": true, 00:12:50.647 "claim_type": "exclusive_write", 00:12:50.647 "zoned": false, 00:12:50.647 "supported_io_types": { 00:12:50.647 "read": true, 00:12:50.647 "write": true, 00:12:50.647 "unmap": true, 00:12:50.647 "flush": true, 00:12:50.647 "reset": true, 00:12:50.647 "nvme_admin": false, 00:12:50.647 "nvme_io": false, 00:12:50.648 "nvme_io_md": false, 00:12:50.648 "write_zeroes": true, 00:12:50.648 "zcopy": true, 00:12:50.648 "get_zone_info": false, 00:12:50.648 "zone_management": false, 00:12:50.648 "zone_append": false, 00:12:50.648 "compare": false, 00:12:50.648 "compare_and_write": false, 00:12:50.648 "abort": true, 00:12:50.648 "seek_hole": false, 00:12:50.648 "seek_data": false, 00:12:50.648 "copy": true, 00:12:50.648 "nvme_iov_md": false 00:12:50.648 }, 00:12:50.648 "memory_domains": [ 00:12:50.648 { 00:12:50.648 "dma_device_id": 
"system", 00:12:50.648 "dma_device_type": 1 00:12:50.648 }, 00:12:50.648 { 00:12:50.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.648 "dma_device_type": 2 00:12:50.648 } 00:12:50.648 ], 00:12:50.648 "driver_specific": {} 00:12:50.648 } 00:12:50.648 ] 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:50.648 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:50.648 13:21:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:50.908 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:50.908 "name": "Existed_Raid", 00:12:50.908 "uuid": "705c921e-374e-49b7-80dc-6af6cbcdd805", 00:12:50.908 "strip_size_kb": 0, 00:12:50.908 "state": "online", 00:12:50.908 "raid_level": "raid1", 00:12:50.908 "superblock": true, 00:12:50.908 "num_base_bdevs": 2, 00:12:50.908 "num_base_bdevs_discovered": 2, 00:12:50.908 "num_base_bdevs_operational": 2, 00:12:50.908 "base_bdevs_list": [ 00:12:50.908 { 00:12:50.908 "name": "BaseBdev1", 00:12:50.908 "uuid": "a0e5fff5-a6c3-4331-9d8e-3d141b9737cd", 00:12:50.908 "is_configured": true, 00:12:50.908 "data_offset": 2048, 00:12:50.908 "data_size": 63488 00:12:50.908 }, 00:12:50.908 { 00:12:50.908 "name": "BaseBdev2", 00:12:50.908 "uuid": "515b1d08-f8a0-4110-b356-6f25f8366e4e", 00:12:50.908 "is_configured": true, 00:12:50.908 "data_offset": 2048, 00:12:50.908 "data_size": 63488 00:12:50.908 } 00:12:50.908 ] 00:12:50.908 }' 00:12:50.908 13:21:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:50.908 13:21:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:51.477 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:51.477 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:51.477 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:51.477 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:51.477 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:51.477 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 
00:12:51.477 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:51.477 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:51.477 [2024-07-25 13:21:32.218363] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:51.477 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:51.477 "name": "Existed_Raid", 00:12:51.477 "aliases": [ 00:12:51.477 "705c921e-374e-49b7-80dc-6af6cbcdd805" 00:12:51.477 ], 00:12:51.477 "product_name": "Raid Volume", 00:12:51.477 "block_size": 512, 00:12:51.477 "num_blocks": 63488, 00:12:51.477 "uuid": "705c921e-374e-49b7-80dc-6af6cbcdd805", 00:12:51.477 "assigned_rate_limits": { 00:12:51.477 "rw_ios_per_sec": 0, 00:12:51.477 "rw_mbytes_per_sec": 0, 00:12:51.477 "r_mbytes_per_sec": 0, 00:12:51.477 "w_mbytes_per_sec": 0 00:12:51.477 }, 00:12:51.477 "claimed": false, 00:12:51.477 "zoned": false, 00:12:51.477 "supported_io_types": { 00:12:51.477 "read": true, 00:12:51.477 "write": true, 00:12:51.477 "unmap": false, 00:12:51.477 "flush": false, 00:12:51.477 "reset": true, 00:12:51.477 "nvme_admin": false, 00:12:51.477 "nvme_io": false, 00:12:51.477 "nvme_io_md": false, 00:12:51.477 "write_zeroes": true, 00:12:51.477 "zcopy": false, 00:12:51.477 "get_zone_info": false, 00:12:51.477 "zone_management": false, 00:12:51.477 "zone_append": false, 00:12:51.477 "compare": false, 00:12:51.477 "compare_and_write": false, 00:12:51.477 "abort": false, 00:12:51.477 "seek_hole": false, 00:12:51.477 "seek_data": false, 00:12:51.477 "copy": false, 00:12:51.477 "nvme_iov_md": false 00:12:51.477 }, 00:12:51.477 "memory_domains": [ 00:12:51.477 { 00:12:51.477 "dma_device_id": "system", 00:12:51.477 "dma_device_type": 1 00:12:51.477 }, 00:12:51.477 { 00:12:51.477 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:51.477 "dma_device_type": 2 00:12:51.477 }, 00:12:51.477 { 00:12:51.477 "dma_device_id": "system", 00:12:51.477 "dma_device_type": 1 00:12:51.477 }, 00:12:51.477 { 00:12:51.477 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.477 "dma_device_type": 2 00:12:51.477 } 00:12:51.477 ], 00:12:51.477 "driver_specific": { 00:12:51.477 "raid": { 00:12:51.477 "uuid": "705c921e-374e-49b7-80dc-6af6cbcdd805", 00:12:51.477 "strip_size_kb": 0, 00:12:51.477 "state": "online", 00:12:51.477 "raid_level": "raid1", 00:12:51.477 "superblock": true, 00:12:51.477 "num_base_bdevs": 2, 00:12:51.477 "num_base_bdevs_discovered": 2, 00:12:51.477 "num_base_bdevs_operational": 2, 00:12:51.477 "base_bdevs_list": [ 00:12:51.477 { 00:12:51.477 "name": "BaseBdev1", 00:12:51.477 "uuid": "a0e5fff5-a6c3-4331-9d8e-3d141b9737cd", 00:12:51.477 "is_configured": true, 00:12:51.477 "data_offset": 2048, 00:12:51.477 "data_size": 63488 00:12:51.477 }, 00:12:51.477 { 00:12:51.477 "name": "BaseBdev2", 00:12:51.477 "uuid": "515b1d08-f8a0-4110-b356-6f25f8366e4e", 00:12:51.477 "is_configured": true, 00:12:51.477 "data_offset": 2048, 00:12:51.477 "data_size": 63488 00:12:51.477 } 00:12:51.477 ] 00:12:51.477 } 00:12:51.477 } 00:12:51.477 }' 00:12:51.477 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:51.738 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:51.738 BaseBdev2' 00:12:51.738 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:51.738 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:51.738 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:51.738 
13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:51.738 "name": "BaseBdev1", 00:12:51.738 "aliases": [ 00:12:51.738 "a0e5fff5-a6c3-4331-9d8e-3d141b9737cd" 00:12:51.738 ], 00:12:51.738 "product_name": "Malloc disk", 00:12:51.738 "block_size": 512, 00:12:51.738 "num_blocks": 65536, 00:12:51.738 "uuid": "a0e5fff5-a6c3-4331-9d8e-3d141b9737cd", 00:12:51.738 "assigned_rate_limits": { 00:12:51.738 "rw_ios_per_sec": 0, 00:12:51.738 "rw_mbytes_per_sec": 0, 00:12:51.738 "r_mbytes_per_sec": 0, 00:12:51.738 "w_mbytes_per_sec": 0 00:12:51.738 }, 00:12:51.738 "claimed": true, 00:12:51.738 "claim_type": "exclusive_write", 00:12:51.738 "zoned": false, 00:12:51.738 "supported_io_types": { 00:12:51.738 "read": true, 00:12:51.738 "write": true, 00:12:51.738 "unmap": true, 00:12:51.738 "flush": true, 00:12:51.738 "reset": true, 00:12:51.738 "nvme_admin": false, 00:12:51.738 "nvme_io": false, 00:12:51.738 "nvme_io_md": false, 00:12:51.738 "write_zeroes": true, 00:12:51.738 "zcopy": true, 00:12:51.738 "get_zone_info": false, 00:12:51.738 "zone_management": false, 00:12:51.738 "zone_append": false, 00:12:51.738 "compare": false, 00:12:51.738 "compare_and_write": false, 00:12:51.738 "abort": true, 00:12:51.738 "seek_hole": false, 00:12:51.738 "seek_data": false, 00:12:51.738 "copy": true, 00:12:51.738 "nvme_iov_md": false 00:12:51.738 }, 00:12:51.738 "memory_domains": [ 00:12:51.738 { 00:12:51.738 "dma_device_id": "system", 00:12:51.738 "dma_device_type": 1 00:12:51.738 }, 00:12:51.738 { 00:12:51.738 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.738 "dma_device_type": 2 00:12:51.738 } 00:12:51.738 ], 00:12:51.738 "driver_specific": {} 00:12:51.738 }' 00:12:51.738 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:51.998 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:51.998 13:21:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:51.998 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:51.998 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:51.998 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:51.998 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:51.998 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:51.998 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:51.998 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:51.998 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.259 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:52.259 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:52.259 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:52.259 13:21:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:52.259 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:52.259 "name": "BaseBdev2", 00:12:52.259 "aliases": [ 00:12:52.259 "515b1d08-f8a0-4110-b356-6f25f8366e4e" 00:12:52.259 ], 00:12:52.259 "product_name": "Malloc disk", 00:12:52.259 "block_size": 512, 00:12:52.259 "num_blocks": 65536, 00:12:52.259 "uuid": "515b1d08-f8a0-4110-b356-6f25f8366e4e", 00:12:52.259 "assigned_rate_limits": { 00:12:52.259 "rw_ios_per_sec": 0, 00:12:52.259 "rw_mbytes_per_sec": 0, 00:12:52.259 "r_mbytes_per_sec": 0, 00:12:52.259 
"w_mbytes_per_sec": 0 00:12:52.259 }, 00:12:52.259 "claimed": true, 00:12:52.259 "claim_type": "exclusive_write", 00:12:52.259 "zoned": false, 00:12:52.259 "supported_io_types": { 00:12:52.259 "read": true, 00:12:52.259 "write": true, 00:12:52.259 "unmap": true, 00:12:52.259 "flush": true, 00:12:52.259 "reset": true, 00:12:52.259 "nvme_admin": false, 00:12:52.259 "nvme_io": false, 00:12:52.259 "nvme_io_md": false, 00:12:52.259 "write_zeroes": true, 00:12:52.259 "zcopy": true, 00:12:52.259 "get_zone_info": false, 00:12:52.259 "zone_management": false, 00:12:52.259 "zone_append": false, 00:12:52.259 "compare": false, 00:12:52.259 "compare_and_write": false, 00:12:52.259 "abort": true, 00:12:52.259 "seek_hole": false, 00:12:52.259 "seek_data": false, 00:12:52.259 "copy": true, 00:12:52.259 "nvme_iov_md": false 00:12:52.259 }, 00:12:52.259 "memory_domains": [ 00:12:52.259 { 00:12:52.259 "dma_device_id": "system", 00:12:52.259 "dma_device_type": 1 00:12:52.259 }, 00:12:52.259 { 00:12:52.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:52.259 "dma_device_type": 2 00:12:52.259 } 00:12:52.259 ], 00:12:52.259 "driver_specific": {} 00:12:52.259 }' 00:12:52.259 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.519 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.520 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:52.520 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.520 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.520 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:52.520 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:52.520 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:12:52.784 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:52.784 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.784 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.784 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:52.784 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:52.784 [2024-07-25 13:21:33.573625] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:53.043 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:53.043 "name": "Existed_Raid", 00:12:53.043 "uuid": "705c921e-374e-49b7-80dc-6af6cbcdd805", 00:12:53.043 "strip_size_kb": 0, 00:12:53.043 "state": "online", 00:12:53.043 "raid_level": "raid1", 00:12:53.043 "superblock": true, 00:12:53.043 "num_base_bdevs": 2, 00:12:53.043 "num_base_bdevs_discovered": 1, 00:12:53.043 "num_base_bdevs_operational": 1, 00:12:53.043 "base_bdevs_list": [ 00:12:53.043 { 00:12:53.043 "name": null, 00:12:53.044 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:53.044 "is_configured": false, 00:12:53.044 "data_offset": 2048, 00:12:53.044 "data_size": 63488 00:12:53.044 }, 00:12:53.044 { 00:12:53.044 "name": "BaseBdev2", 00:12:53.044 "uuid": "515b1d08-f8a0-4110-b356-6f25f8366e4e", 00:12:53.044 "is_configured": true, 00:12:53.044 "data_offset": 2048, 00:12:53.044 "data_size": 63488 00:12:53.044 } 00:12:53.044 ] 00:12:53.044 }' 00:12:53.044 13:21:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:53.044 13:21:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set 
+x 00:12:53.614 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:53.614 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:53.614 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:53.614 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:53.874 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:53.874 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:53.874 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:54.134 [2024-07-25 13:21:34.704485] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:54.134 [2024-07-25 13:21:34.704545] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:54.134 [2024-07-25 13:21:34.710598] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:54.134 [2024-07-25 13:21:34.710623] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:54.134 [2024-07-25 13:21:34.710629] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fdeda0 name Existed_Raid, state offline 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 888290 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 888290 ']' 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 888290 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:12:54.134 13:21:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 888290 00:12:54.394 13:21:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:12:54.394 13:21:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:12:54.394 13:21:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 888290' 00:12:54.394 killing process with pid 888290 00:12:54.394 13:21:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 888290 00:12:54.394 [2024-07-25 13:21:34.968452] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:54.394 13:21:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 888290 
00:12:54.394 [2024-07-25 13:21:34.969048] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:54.394 13:21:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:54.394 00:12:54.394 real 0m9.948s 00:12:54.394 user 0m18.157s 00:12:54.394 sys 0m1.487s 00:12:54.394 13:21:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:12:54.394 13:21:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:54.394 ************************************ 00:12:54.394 END TEST raid_state_function_test_sb 00:12:54.394 ************************************ 00:12:54.394 13:21:35 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:12:54.394 13:21:35 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:12:54.394 13:21:35 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:12:54.394 13:21:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:54.394 ************************************ 00:12:54.394 START TEST raid_superblock_test 00:12:54.394 ************************************ 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 
-- # base_bdevs_pt_uuid=() 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=890238 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 890238 /var/tmp/spdk-raid.sock 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 890238 ']' 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:54.394 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:12:54.394 13:21:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:54.655 [2024-07-25 13:21:35.222746] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:12:54.655 [2024-07-25 13:21:35.222803] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid890238 ] 00:12:54.655 [2024-07-25 13:21:35.313452] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:54.655 [2024-07-25 13:21:35.390639] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:12:54.655 [2024-07-25 13:21:35.433683] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:54.655 [2024-07-25 13:21:35.433722] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:12:55.595 malloc1 00:12:55.595 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:55.855 [2024-07-25 13:21:36.396471] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:55.855 [2024-07-25 13:21:36.396506] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:55.855 [2024-07-25 13:21:36.396518] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25379b0 00:12:55.855 [2024-07-25 13:21:36.396525] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:55.855 [2024-07-25 13:21:36.397797] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:55.855 [2024-07-25 13:21:36.397817] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:55.855 pt1 00:12:55.855 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:12:55.855 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:55.855 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:12:55.855 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:12:55.855 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:55.855 13:21:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:55.855 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:12:55.855 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:55.855 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:55.855 malloc2 00:12:55.855 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:56.115 [2024-07-25 13:21:36.779239] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:56.115 [2024-07-25 13:21:36.779270] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:56.115 [2024-07-25 13:21:36.779279] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2538db0 00:12:56.115 [2024-07-25 13:21:36.779286] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:56.115 [2024-07-25 13:21:36.780479] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:56.115 [2024-07-25 13:21:36.780498] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:56.115 pt2 00:12:56.115 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:12:56.115 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:12:56.115 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:12:56.376 [2024-07-25 13:21:36.967740] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:56.376 [2024-07-25 13:21:36.968711] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:56.376 [2024-07-25 13:21:36.968811] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x26db6b0 00:12:56.376 [2024-07-25 13:21:36.968819] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:12:56.376 [2024-07-25 13:21:36.968967] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2530ca0 00:12:56.376 [2024-07-25 13:21:36.969073] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26db6b0 00:12:56.376 [2024-07-25 13:21:36.969079] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26db6b0 00:12:56.376 [2024-07-25 13:21:36.969156] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:56.376 13:21:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:56.376 13:21:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:56.376 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:56.376 "name": "raid_bdev1", 00:12:56.376 "uuid": "cbd8a854-3f83-4796-a0b6-9998bb9aaf84", 00:12:56.376 "strip_size_kb": 0, 00:12:56.376 "state": "online", 00:12:56.376 "raid_level": "raid1", 00:12:56.376 "superblock": true, 00:12:56.376 "num_base_bdevs": 2, 00:12:56.376 "num_base_bdevs_discovered": 2, 00:12:56.376 "num_base_bdevs_operational": 2, 00:12:56.376 "base_bdevs_list": [ 00:12:56.376 { 00:12:56.376 "name": "pt1", 00:12:56.376 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:56.376 "is_configured": true, 00:12:56.376 "data_offset": 2048, 00:12:56.376 "data_size": 63488 00:12:56.376 }, 00:12:56.376 { 00:12:56.376 "name": "pt2", 00:12:56.376 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:56.376 "is_configured": true, 00:12:56.376 "data_offset": 2048, 00:12:56.376 "data_size": 63488 00:12:56.376 } 00:12:56.376 ] 00:12:56.376 }' 00:12:56.376 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:56.376 13:21:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:57.007 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:12:57.007 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:57.007 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:57.007 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:12:57.007 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:57.007 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:57.007 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:57.007 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:57.293 [2024-07-25 13:21:37.834112] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:57.293 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:57.293 "name": "raid_bdev1", 00:12:57.293 "aliases": [ 00:12:57.293 "cbd8a854-3f83-4796-a0b6-9998bb9aaf84" 00:12:57.293 ], 00:12:57.293 "product_name": "Raid Volume", 00:12:57.293 "block_size": 512, 00:12:57.293 "num_blocks": 63488, 00:12:57.293 "uuid": "cbd8a854-3f83-4796-a0b6-9998bb9aaf84", 00:12:57.293 "assigned_rate_limits": { 00:12:57.293 "rw_ios_per_sec": 0, 00:12:57.293 "rw_mbytes_per_sec": 0, 00:12:57.294 "r_mbytes_per_sec": 0, 00:12:57.294 "w_mbytes_per_sec": 0 00:12:57.294 }, 00:12:57.294 "claimed": false, 00:12:57.294 "zoned": false, 00:12:57.294 "supported_io_types": { 00:12:57.294 "read": true, 00:12:57.294 "write": true, 00:12:57.294 "unmap": false, 00:12:57.294 "flush": false, 00:12:57.294 "reset": true, 00:12:57.294 "nvme_admin": false, 00:12:57.294 "nvme_io": false, 00:12:57.294 "nvme_io_md": false, 00:12:57.294 "write_zeroes": true, 00:12:57.294 "zcopy": false, 00:12:57.294 "get_zone_info": false, 00:12:57.294 "zone_management": false, 00:12:57.294 "zone_append": false, 00:12:57.294 "compare": false, 00:12:57.294 "compare_and_write": false, 00:12:57.294 "abort": false, 00:12:57.294 "seek_hole": false, 00:12:57.294 "seek_data": false, 00:12:57.294 "copy": false, 00:12:57.294 "nvme_iov_md": false 00:12:57.294 }, 00:12:57.294 "memory_domains": 
[ 00:12:57.294 { 00:12:57.294 "dma_device_id": "system", 00:12:57.294 "dma_device_type": 1 00:12:57.294 }, 00:12:57.294 { 00:12:57.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.294 "dma_device_type": 2 00:12:57.294 }, 00:12:57.294 { 00:12:57.294 "dma_device_id": "system", 00:12:57.294 "dma_device_type": 1 00:12:57.294 }, 00:12:57.294 { 00:12:57.294 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.294 "dma_device_type": 2 00:12:57.294 } 00:12:57.294 ], 00:12:57.294 "driver_specific": { 00:12:57.294 "raid": { 00:12:57.294 "uuid": "cbd8a854-3f83-4796-a0b6-9998bb9aaf84", 00:12:57.294 "strip_size_kb": 0, 00:12:57.294 "state": "online", 00:12:57.294 "raid_level": "raid1", 00:12:57.294 "superblock": true, 00:12:57.294 "num_base_bdevs": 2, 00:12:57.294 "num_base_bdevs_discovered": 2, 00:12:57.294 "num_base_bdevs_operational": 2, 00:12:57.294 "base_bdevs_list": [ 00:12:57.294 { 00:12:57.294 "name": "pt1", 00:12:57.294 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:57.294 "is_configured": true, 00:12:57.294 "data_offset": 2048, 00:12:57.294 "data_size": 63488 00:12:57.294 }, 00:12:57.294 { 00:12:57.294 "name": "pt2", 00:12:57.294 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:57.294 "is_configured": true, 00:12:57.294 "data_offset": 2048, 00:12:57.294 "data_size": 63488 00:12:57.294 } 00:12:57.294 ] 00:12:57.294 } 00:12:57.294 } 00:12:57.294 }' 00:12:57.294 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:57.294 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:57.294 pt2' 00:12:57.294 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:57.294 13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:57.294 
13:21:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:57.554 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:57.554 "name": "pt1", 00:12:57.554 "aliases": [ 00:12:57.554 "00000000-0000-0000-0000-000000000001" 00:12:57.554 ], 00:12:57.554 "product_name": "passthru", 00:12:57.554 "block_size": 512, 00:12:57.554 "num_blocks": 65536, 00:12:57.554 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:57.554 "assigned_rate_limits": { 00:12:57.554 "rw_ios_per_sec": 0, 00:12:57.554 "rw_mbytes_per_sec": 0, 00:12:57.554 "r_mbytes_per_sec": 0, 00:12:57.554 "w_mbytes_per_sec": 0 00:12:57.554 }, 00:12:57.554 "claimed": true, 00:12:57.554 "claim_type": "exclusive_write", 00:12:57.554 "zoned": false, 00:12:57.554 "supported_io_types": { 00:12:57.554 "read": true, 00:12:57.555 "write": true, 00:12:57.555 "unmap": true, 00:12:57.555 "flush": true, 00:12:57.555 "reset": true, 00:12:57.555 "nvme_admin": false, 00:12:57.555 "nvme_io": false, 00:12:57.555 "nvme_io_md": false, 00:12:57.555 "write_zeroes": true, 00:12:57.555 "zcopy": true, 00:12:57.555 "get_zone_info": false, 00:12:57.555 "zone_management": false, 00:12:57.555 "zone_append": false, 00:12:57.555 "compare": false, 00:12:57.555 "compare_and_write": false, 00:12:57.555 "abort": true, 00:12:57.555 "seek_hole": false, 00:12:57.555 "seek_data": false, 00:12:57.555 "copy": true, 00:12:57.555 "nvme_iov_md": false 00:12:57.555 }, 00:12:57.555 "memory_domains": [ 00:12:57.555 { 00:12:57.555 "dma_device_id": "system", 00:12:57.555 "dma_device_type": 1 00:12:57.555 }, 00:12:57.555 { 00:12:57.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.555 "dma_device_type": 2 00:12:57.555 } 00:12:57.555 ], 00:12:57.555 "driver_specific": { 00:12:57.555 "passthru": { 00:12:57.555 "name": "pt1", 00:12:57.555 "base_bdev_name": "malloc1" 00:12:57.555 } 00:12:57.555 } 00:12:57.555 }' 00:12:57.555 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:12:57.555 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.555 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:57.555 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.555 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.555 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:57.555 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.555 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.555 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:57.555 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.815 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.815 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:57.815 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:57.815 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:57.815 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:57.815 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:57.815 "name": "pt2", 00:12:57.815 "aliases": [ 00:12:57.815 "00000000-0000-0000-0000-000000000002" 00:12:57.815 ], 00:12:57.815 "product_name": "passthru", 00:12:57.815 "block_size": 512, 00:12:57.815 "num_blocks": 65536, 00:12:57.815 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:57.815 "assigned_rate_limits": { 00:12:57.815 "rw_ios_per_sec": 0, 00:12:57.815 "rw_mbytes_per_sec": 0, 
00:12:57.815 "r_mbytes_per_sec": 0, 00:12:57.815 "w_mbytes_per_sec": 0 00:12:57.815 }, 00:12:57.815 "claimed": true, 00:12:57.815 "claim_type": "exclusive_write", 00:12:57.815 "zoned": false, 00:12:57.815 "supported_io_types": { 00:12:57.815 "read": true, 00:12:57.815 "write": true, 00:12:57.815 "unmap": true, 00:12:57.815 "flush": true, 00:12:57.815 "reset": true, 00:12:57.815 "nvme_admin": false, 00:12:57.815 "nvme_io": false, 00:12:57.815 "nvme_io_md": false, 00:12:57.815 "write_zeroes": true, 00:12:57.815 "zcopy": true, 00:12:57.815 "get_zone_info": false, 00:12:57.815 "zone_management": false, 00:12:57.815 "zone_append": false, 00:12:57.815 "compare": false, 00:12:57.815 "compare_and_write": false, 00:12:57.815 "abort": true, 00:12:57.815 "seek_hole": false, 00:12:57.815 "seek_data": false, 00:12:57.815 "copy": true, 00:12:57.815 "nvme_iov_md": false 00:12:57.815 }, 00:12:57.815 "memory_domains": [ 00:12:57.815 { 00:12:57.815 "dma_device_id": "system", 00:12:57.815 "dma_device_type": 1 00:12:57.815 }, 00:12:57.815 { 00:12:57.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.815 "dma_device_type": 2 00:12:57.815 } 00:12:57.815 ], 00:12:57.815 "driver_specific": { 00:12:57.815 "passthru": { 00:12:57.815 "name": "pt2", 00:12:57.815 "base_bdev_name": "malloc2" 00:12:57.815 } 00:12:57.815 } 00:12:57.815 }' 00:12:57.815 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:58.075 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:58.075 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:58.075 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:58.075 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:58.075 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:58.075 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:12:58.075 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:58.075 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:58.075 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:58.334 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:58.334 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:58.334 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:58.334 13:21:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:12:58.594 [2024-07-25 13:21:39.133395] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:58.594 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=cbd8a854-3f83-4796-a0b6-9998bb9aaf84 00:12:58.594 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z cbd8a854-3f83-4796-a0b6-9998bb9aaf84 ']' 00:12:58.594 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:58.594 [2024-07-25 13:21:39.325692] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:58.594 [2024-07-25 13:21:39.325707] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:58.594 [2024-07-25 13:21:39.325747] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:58.594 [2024-07-25 13:21:39.325788] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:58.594 [2024-07-25 13:21:39.325795] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26db6b0 name raid_bdev1, state offline 00:12:58.594 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:58.594 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:12:58.854 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:12:58.854 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:12:58.854 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:12:58.854 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:59.115 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:12:59.115 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:59.375 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:12:59.375 13:21:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:12:59.375 13:21:40 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:12:59.375 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:12:59.636 [2024-07-25 13:21:40.296106] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:12:59.636 [2024-07-25 13:21:40.297172] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:12:59.636 [2024-07-25 13:21:40.297215] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: 
Superblock of a different raid bdev found on bdev malloc1 00:12:59.636 [2024-07-25 13:21:40.297244] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:12:59.636 [2024-07-25 13:21:40.297255] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:59.636 [2024-07-25 13:21:40.297260] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2537e50 name raid_bdev1, state configuring 00:12:59.636 request: 00:12:59.636 { 00:12:59.636 "name": "raid_bdev1", 00:12:59.636 "raid_level": "raid1", 00:12:59.636 "base_bdevs": [ 00:12:59.636 "malloc1", 00:12:59.636 "malloc2" 00:12:59.636 ], 00:12:59.636 "superblock": false, 00:12:59.636 "method": "bdev_raid_create", 00:12:59.636 "req_id": 1 00:12:59.636 } 00:12:59.636 Got JSON-RPC error response 00:12:59.636 response: 00:12:59.636 { 00:12:59.636 "code": -17, 00:12:59.636 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:12:59.636 } 00:12:59.636 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:12:59.636 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:12:59.636 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:12:59.636 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:12:59.636 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.636 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:12:59.897 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:12:59.897 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:12:59.897 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:59.897 [2024-07-25 13:21:40.681037] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:59.897 [2024-07-25 13:21:40.681063] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:59.897 [2024-07-25 13:21:40.681075] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2537be0 00:12:59.897 [2024-07-25 13:21:40.681081] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:59.897 [2024-07-25 13:21:40.682356] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:59.897 [2024-07-25 13:21:40.682377] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:59.897 [2024-07-25 13:21:40.682424] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:12:59.897 [2024-07-25 13:21:40.682450] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:59.897 pt1 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.157 13:21:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:00.157 "name": "raid_bdev1", 00:13:00.157 "uuid": "cbd8a854-3f83-4796-a0b6-9998bb9aaf84", 00:13:00.157 "strip_size_kb": 0, 00:13:00.157 "state": "configuring", 00:13:00.157 "raid_level": "raid1", 00:13:00.157 "superblock": true, 00:13:00.157 "num_base_bdevs": 2, 00:13:00.157 "num_base_bdevs_discovered": 1, 00:13:00.157 "num_base_bdevs_operational": 2, 00:13:00.157 "base_bdevs_list": [ 00:13:00.157 { 00:13:00.157 "name": "pt1", 00:13:00.157 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:00.157 "is_configured": true, 00:13:00.157 "data_offset": 2048, 00:13:00.157 "data_size": 63488 00:13:00.157 }, 00:13:00.157 { 00:13:00.157 "name": null, 00:13:00.157 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:00.157 "is_configured": false, 00:13:00.157 "data_offset": 2048, 00:13:00.157 "data_size": 63488 00:13:00.157 } 00:13:00.157 ] 00:13:00.157 }' 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:00.157 13:21:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:00.727 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:13:00.727 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:13:00.727 13:21:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:13:00.727 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:00.986 [2024-07-25 13:21:41.639463] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:00.986 [2024-07-25 13:21:41.639493] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:00.986 [2024-07-25 13:21:41.639503] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26db430 00:13:00.986 [2024-07-25 13:21:41.639509] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:00.986 [2024-07-25 13:21:41.639790] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:00.986 [2024-07-25 13:21:41.639803] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:00.986 [2024-07-25 13:21:41.639844] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:00.986 [2024-07-25 13:21:41.639857] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:00.986 [2024-07-25 13:21:41.639934] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x26d0100 00:13:00.986 [2024-07-25 13:21:41.639941] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:00.986 [2024-07-25 13:21:41.640077] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x252fb30 00:13:00.986 [2024-07-25 13:21:41.640176] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26d0100 00:13:00.986 [2024-07-25 13:21:41.640186] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26d0100 00:13:00.986 [2024-07-25 13:21:41.640258] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:00.986 pt2 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.986 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:01.246 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.246 "name": "raid_bdev1", 00:13:01.246 "uuid": "cbd8a854-3f83-4796-a0b6-9998bb9aaf84", 00:13:01.246 "strip_size_kb": 0, 00:13:01.246 "state": "online", 00:13:01.246 
"raid_level": "raid1", 00:13:01.246 "superblock": true, 00:13:01.246 "num_base_bdevs": 2, 00:13:01.246 "num_base_bdevs_discovered": 2, 00:13:01.246 "num_base_bdevs_operational": 2, 00:13:01.246 "base_bdevs_list": [ 00:13:01.246 { 00:13:01.246 "name": "pt1", 00:13:01.246 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:01.246 "is_configured": true, 00:13:01.246 "data_offset": 2048, 00:13:01.246 "data_size": 63488 00:13:01.246 }, 00:13:01.246 { 00:13:01.246 "name": "pt2", 00:13:01.246 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:01.246 "is_configured": true, 00:13:01.246 "data_offset": 2048, 00:13:01.246 "data_size": 63488 00:13:01.246 } 00:13:01.246 ] 00:13:01.246 }' 00:13:01.246 13:21:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.246 13:21:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:01.816 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:13:01.816 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:01.816 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:01.816 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:01.816 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:01.816 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:01.816 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:01.816 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:01.816 [2024-07-25 13:21:42.586182] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:01.816 13:21:42 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:01.816 "name": "raid_bdev1", 00:13:01.816 "aliases": [ 00:13:01.816 "cbd8a854-3f83-4796-a0b6-9998bb9aaf84" 00:13:01.816 ], 00:13:01.816 "product_name": "Raid Volume", 00:13:01.816 "block_size": 512, 00:13:01.816 "num_blocks": 63488, 00:13:01.816 "uuid": "cbd8a854-3f83-4796-a0b6-9998bb9aaf84", 00:13:01.816 "assigned_rate_limits": { 00:13:01.816 "rw_ios_per_sec": 0, 00:13:01.816 "rw_mbytes_per_sec": 0, 00:13:01.816 "r_mbytes_per_sec": 0, 00:13:01.816 "w_mbytes_per_sec": 0 00:13:01.816 }, 00:13:01.816 "claimed": false, 00:13:01.816 "zoned": false, 00:13:01.816 "supported_io_types": { 00:13:01.816 "read": true, 00:13:01.816 "write": true, 00:13:01.816 "unmap": false, 00:13:01.816 "flush": false, 00:13:01.816 "reset": true, 00:13:01.816 "nvme_admin": false, 00:13:01.816 "nvme_io": false, 00:13:01.816 "nvme_io_md": false, 00:13:01.816 "write_zeroes": true, 00:13:01.816 "zcopy": false, 00:13:01.816 "get_zone_info": false, 00:13:01.816 "zone_management": false, 00:13:01.816 "zone_append": false, 00:13:01.816 "compare": false, 00:13:01.816 "compare_and_write": false, 00:13:01.816 "abort": false, 00:13:01.816 "seek_hole": false, 00:13:01.816 "seek_data": false, 00:13:01.816 "copy": false, 00:13:01.816 "nvme_iov_md": false 00:13:01.816 }, 00:13:01.816 "memory_domains": [ 00:13:01.816 { 00:13:01.816 "dma_device_id": "system", 00:13:01.817 "dma_device_type": 1 00:13:01.817 }, 00:13:01.817 { 00:13:01.817 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.817 "dma_device_type": 2 00:13:01.817 }, 00:13:01.817 { 00:13:01.817 "dma_device_id": "system", 00:13:01.817 "dma_device_type": 1 00:13:01.817 }, 00:13:01.817 { 00:13:01.817 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:01.817 "dma_device_type": 2 00:13:01.817 } 00:13:01.817 ], 00:13:01.817 "driver_specific": { 00:13:01.817 "raid": { 00:13:01.817 "uuid": "cbd8a854-3f83-4796-a0b6-9998bb9aaf84", 00:13:01.817 "strip_size_kb": 0, 00:13:01.817 "state": "online", 00:13:01.817 
"raid_level": "raid1", 00:13:01.817 "superblock": true, 00:13:01.817 "num_base_bdevs": 2, 00:13:01.817 "num_base_bdevs_discovered": 2, 00:13:01.817 "num_base_bdevs_operational": 2, 00:13:01.817 "base_bdevs_list": [ 00:13:01.817 { 00:13:01.817 "name": "pt1", 00:13:01.817 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:01.817 "is_configured": true, 00:13:01.817 "data_offset": 2048, 00:13:01.817 "data_size": 63488 00:13:01.817 }, 00:13:01.817 { 00:13:01.817 "name": "pt2", 00:13:01.817 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:01.817 "is_configured": true, 00:13:01.817 "data_offset": 2048, 00:13:01.817 "data_size": 63488 00:13:01.817 } 00:13:01.817 ] 00:13:01.817 } 00:13:01.817 } 00:13:01.817 }' 00:13:01.817 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:02.077 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:02.077 pt2' 00:13:02.077 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:02.077 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:02.077 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:02.077 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:02.077 "name": "pt1", 00:13:02.077 "aliases": [ 00:13:02.077 "00000000-0000-0000-0000-000000000001" 00:13:02.077 ], 00:13:02.077 "product_name": "passthru", 00:13:02.077 "block_size": 512, 00:13:02.077 "num_blocks": 65536, 00:13:02.077 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:02.077 "assigned_rate_limits": { 00:13:02.077 "rw_ios_per_sec": 0, 00:13:02.077 "rw_mbytes_per_sec": 0, 00:13:02.077 "r_mbytes_per_sec": 0, 00:13:02.077 "w_mbytes_per_sec": 0 00:13:02.077 }, 
00:13:02.077 "claimed": true, 00:13:02.077 "claim_type": "exclusive_write", 00:13:02.077 "zoned": false, 00:13:02.077 "supported_io_types": { 00:13:02.077 "read": true, 00:13:02.077 "write": true, 00:13:02.077 "unmap": true, 00:13:02.077 "flush": true, 00:13:02.077 "reset": true, 00:13:02.077 "nvme_admin": false, 00:13:02.077 "nvme_io": false, 00:13:02.077 "nvme_io_md": false, 00:13:02.077 "write_zeroes": true, 00:13:02.077 "zcopy": true, 00:13:02.077 "get_zone_info": false, 00:13:02.077 "zone_management": false, 00:13:02.077 "zone_append": false, 00:13:02.077 "compare": false, 00:13:02.077 "compare_and_write": false, 00:13:02.077 "abort": true, 00:13:02.077 "seek_hole": false, 00:13:02.077 "seek_data": false, 00:13:02.077 "copy": true, 00:13:02.077 "nvme_iov_md": false 00:13:02.077 }, 00:13:02.077 "memory_domains": [ 00:13:02.077 { 00:13:02.077 "dma_device_id": "system", 00:13:02.077 "dma_device_type": 1 00:13:02.077 }, 00:13:02.077 { 00:13:02.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.077 "dma_device_type": 2 00:13:02.077 } 00:13:02.077 ], 00:13:02.077 "driver_specific": { 00:13:02.077 "passthru": { 00:13:02.077 "name": "pt1", 00:13:02.077 "base_bdev_name": "malloc1" 00:13:02.077 } 00:13:02.077 } 00:13:02.077 }' 00:13:02.077 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.336 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.336 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:02.336 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:02.336 13:21:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:02.336 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:02.336 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:02.336 13:21:43 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:02.336 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:02.336 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:02.596 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:02.596 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:02.596 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:02.596 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:02.596 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:02.855 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:02.855 "name": "pt2", 00:13:02.855 "aliases": [ 00:13:02.855 "00000000-0000-0000-0000-000000000002" 00:13:02.855 ], 00:13:02.855 "product_name": "passthru", 00:13:02.855 "block_size": 512, 00:13:02.855 "num_blocks": 65536, 00:13:02.855 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:02.855 "assigned_rate_limits": { 00:13:02.855 "rw_ios_per_sec": 0, 00:13:02.855 "rw_mbytes_per_sec": 0, 00:13:02.855 "r_mbytes_per_sec": 0, 00:13:02.855 "w_mbytes_per_sec": 0 00:13:02.855 }, 00:13:02.855 "claimed": true, 00:13:02.855 "claim_type": "exclusive_write", 00:13:02.855 "zoned": false, 00:13:02.855 "supported_io_types": { 00:13:02.855 "read": true, 00:13:02.855 "write": true, 00:13:02.855 "unmap": true, 00:13:02.855 "flush": true, 00:13:02.855 "reset": true, 00:13:02.855 "nvme_admin": false, 00:13:02.855 "nvme_io": false, 00:13:02.855 "nvme_io_md": false, 00:13:02.855 "write_zeroes": true, 00:13:02.855 "zcopy": true, 00:13:02.855 "get_zone_info": false, 00:13:02.855 "zone_management": false, 00:13:02.855 "zone_append": false, 00:13:02.855 
"compare": false, 00:13:02.855 "compare_and_write": false, 00:13:02.855 "abort": true, 00:13:02.855 "seek_hole": false, 00:13:02.855 "seek_data": false, 00:13:02.855 "copy": true, 00:13:02.855 "nvme_iov_md": false 00:13:02.855 }, 00:13:02.855 "memory_domains": [ 00:13:02.855 { 00:13:02.855 "dma_device_id": "system", 00:13:02.855 "dma_device_type": 1 00:13:02.855 }, 00:13:02.855 { 00:13:02.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:02.855 "dma_device_type": 2 00:13:02.855 } 00:13:02.855 ], 00:13:02.855 "driver_specific": { 00:13:02.855 "passthru": { 00:13:02.855 "name": "pt2", 00:13:02.855 "base_bdev_name": "malloc2" 00:13:02.855 } 00:13:02.855 } 00:13:02.855 }' 00:13:02.855 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.855 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:02.855 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:02.855 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:02.855 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:02.855 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:02.855 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:02.855 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:03.113 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:03.113 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.113 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:03.113 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:03.113 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:03.113 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:13:03.372 [2024-07-25 13:21:43.929591] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:03.372 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' cbd8a854-3f83-4796-a0b6-9998bb9aaf84 '!=' cbd8a854-3f83-4796-a0b6-9998bb9aaf84 ']' 00:13:03.372 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:13:03.372 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:03.372 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:13:03.372 13:21:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:13:03.372 [2024-07-25 13:21:44.121899] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:03.372 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:03.631 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:03.631 "name": "raid_bdev1", 00:13:03.631 "uuid": "cbd8a854-3f83-4796-a0b6-9998bb9aaf84", 00:13:03.631 "strip_size_kb": 0, 00:13:03.631 "state": "online", 00:13:03.631 "raid_level": "raid1", 00:13:03.631 "superblock": true, 00:13:03.631 "num_base_bdevs": 2, 00:13:03.631 "num_base_bdevs_discovered": 1, 00:13:03.631 "num_base_bdevs_operational": 1, 00:13:03.631 "base_bdevs_list": [ 00:13:03.631 { 00:13:03.631 "name": null, 00:13:03.631 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:03.631 "is_configured": false, 00:13:03.631 "data_offset": 2048, 00:13:03.631 "data_size": 63488 00:13:03.631 }, 00:13:03.631 { 00:13:03.631 "name": "pt2", 00:13:03.631 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:03.631 "is_configured": true, 00:13:03.631 "data_offset": 2048, 00:13:03.631 "data_size": 63488 00:13:03.631 } 00:13:03.631 ] 00:13:03.631 }' 00:13:03.631 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:03.631 13:21:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:04.199 13:21:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:04.459 [2024-07-25 13:21:45.012132] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: 
raid_bdev1 00:13:04.459 [2024-07-25 13:21:45.012150] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:04.459 [2024-07-25 13:21:45.012185] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:04.459 [2024-07-25 13:21:45.012214] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:04.459 [2024-07-25 13:21:45.012221] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26d0100 name raid_bdev1, state offline 00:13:04.459 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.459 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:13:04.459 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:13:04.459 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:13:04.459 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:13:04.459 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:13:04.459 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:04.718 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:13:04.718 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:13:04.718 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:13:04.718 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:13:04.718 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=1 00:13:04.718 13:21:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:04.978 [2024-07-25 13:21:45.601605] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:04.978 [2024-07-25 13:21:45.601634] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:04.978 [2024-07-25 13:21:45.601644] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26cfa10 00:13:04.978 [2024-07-25 13:21:45.601650] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:04.978 [2024-07-25 13:21:45.602939] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:04.978 [2024-07-25 13:21:45.602960] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:04.978 [2024-07-25 13:21:45.603006] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:04.978 [2024-07-25 13:21:45.603025] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:04.978 [2024-07-25 13:21:45.603087] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x252f300 00:13:04.978 [2024-07-25 13:21:45.603093] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:04.978 [2024-07-25 13:21:45.603230] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2531650 00:13:04.978 [2024-07-25 13:21:45.603327] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x252f300 00:13:04.978 [2024-07-25 13:21:45.603333] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x252f300 00:13:04.978 [2024-07-25 13:21:45.603404] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:04.978 pt2 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.978 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:05.237 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:05.237 "name": "raid_bdev1", 00:13:05.237 "uuid": "cbd8a854-3f83-4796-a0b6-9998bb9aaf84", 00:13:05.237 "strip_size_kb": 0, 00:13:05.237 "state": "online", 00:13:05.237 "raid_level": "raid1", 00:13:05.237 "superblock": true, 00:13:05.237 "num_base_bdevs": 2, 00:13:05.237 "num_base_bdevs_discovered": 1, 00:13:05.237 "num_base_bdevs_operational": 1, 00:13:05.237 "base_bdevs_list": [ 00:13:05.237 { 00:13:05.237 "name": null, 00:13:05.237 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:05.237 "is_configured": false, 
00:13:05.237 "data_offset": 2048, 00:13:05.237 "data_size": 63488 00:13:05.237 }, 00:13:05.237 { 00:13:05.237 "name": "pt2", 00:13:05.237 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:05.237 "is_configured": true, 00:13:05.237 "data_offset": 2048, 00:13:05.237 "data_size": 63488 00:13:05.237 } 00:13:05.237 ] 00:13:05.237 }' 00:13:05.237 13:21:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:05.237 13:21:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:05.805 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:05.805 [2024-07-25 13:21:46.543986] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:05.805 [2024-07-25 13:21:46.544002] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:05.805 [2024-07-25 13:21:46.544038] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:05.805 [2024-07-25 13:21:46.544068] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:05.805 [2024-07-25 13:21:46.544075] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x252f300 name raid_bdev1, state offline 00:13:05.805 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:05.805 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:13:06.064 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:13:06.064 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:13:06.064 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:13:06.064 
13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:06.325 [2024-07-25 13:21:46.936963] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:06.325 [2024-07-25 13:21:46.936991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:06.325 [2024-07-25 13:21:46.937002] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2537e50 00:13:06.325 [2024-07-25 13:21:46.937008] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:06.325 [2024-07-25 13:21:46.938293] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:06.325 [2024-07-25 13:21:46.938315] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:06.325 [2024-07-25 13:21:46.938362] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:06.325 [2024-07-25 13:21:46.938381] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:06.325 [2024-07-25 13:21:46.938458] bdev_raid.c:3665:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:13:06.325 [2024-07-25 13:21:46.938466] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:06.325 [2024-07-25 13:21:46.938474] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2531cf0 name raid_bdev1, state configuring 00:13:06.325 [2024-07-25 13:21:46.938488] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:06.325 [2024-07-25 13:21:46.938525] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x25328a0 00:13:06.325 [2024-07-25 13:21:46.938531] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 
63488, blocklen 512 00:13:06.325 [2024-07-25 13:21:46.938677] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26cfd30 00:13:06.325 [2024-07-25 13:21:46.938775] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25328a0 00:13:06.325 [2024-07-25 13:21:46.938780] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25328a0 00:13:06.325 [2024-07-25 13:21:46.938853] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:06.325 pt1 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:06.325 13:21:46 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:06.587 13:21:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:06.587 "name": "raid_bdev1", 00:13:06.587 "uuid": "cbd8a854-3f83-4796-a0b6-9998bb9aaf84", 00:13:06.587 "strip_size_kb": 0, 00:13:06.587 "state": "online", 00:13:06.587 "raid_level": "raid1", 00:13:06.587 "superblock": true, 00:13:06.587 "num_base_bdevs": 2, 00:13:06.587 "num_base_bdevs_discovered": 1, 00:13:06.587 "num_base_bdevs_operational": 1, 00:13:06.587 "base_bdevs_list": [ 00:13:06.587 { 00:13:06.587 "name": null, 00:13:06.587 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:06.587 "is_configured": false, 00:13:06.587 "data_offset": 2048, 00:13:06.587 "data_size": 63488 00:13:06.587 }, 00:13:06.587 { 00:13:06.587 "name": "pt2", 00:13:06.587 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:06.587 "is_configured": true, 00:13:06.587 "data_offset": 2048, 00:13:06.587 "data_size": 63488 00:13:06.587 } 00:13:06.587 ] 00:13:06.587 }' 00:13:06.587 13:21:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:06.587 13:21:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.155 13:21:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:13:07.155 13:21:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:13:07.155 13:21:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:13:07.155 13:21:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:07.155 13:21:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:13:07.415 [2024-07-25 
13:21:48.076022] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' cbd8a854-3f83-4796-a0b6-9998bb9aaf84 '!=' cbd8a854-3f83-4796-a0b6-9998bb9aaf84 ']' 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 890238 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 890238 ']' 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 890238 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 890238 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 890238' 00:13:07.415 killing process with pid 890238 00:13:07.415 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 890238 00:13:07.415 [2024-07-25 13:21:48.146450] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:07.415 [2024-07-25 13:21:48.146489] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:07.415 [2024-07-25 13:21:48.146519] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:07.415 [2024-07-25 13:21:48.146529] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25328a0 name raid_bdev1, state offline 00:13:07.415 13:21:48 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 890238 00:13:07.415 [2024-07-25 13:21:48.155770] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:07.675 13:21:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:13:07.675 00:13:07.675 real 0m13.110s 00:13:07.675 user 0m24.215s 00:13:07.675 sys 0m2.044s 00:13:07.675 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:07.675 13:21:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.675 ************************************ 00:13:07.675 END TEST raid_superblock_test 00:13:07.675 ************************************ 00:13:07.675 13:21:48 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:13:07.675 13:21:48 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:07.675 13:21:48 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:07.675 13:21:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:07.675 ************************************ 00:13:07.675 START TEST raid_read_error_test 00:13:07.675 ************************************ 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 2 read 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:13:07.675 13:21:48 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:13:07.675 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.MkElEHljIw 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=892818 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 892818 /var/tmp/spdk-raid.sock 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 892818 ']' 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:07.676 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:07.676 13:21:48 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.676 [2024-07-25 13:21:48.421766] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:13:07.676 [2024-07-25 13:21:48.421823] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid892818 ] 00:13:07.935 [2024-07-25 13:21:48.511788] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:07.935 [2024-07-25 13:21:48.579456] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:07.935 [2024-07-25 13:21:48.620804] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:07.935 [2024-07-25 13:21:48.620828] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:08.505 13:21:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:08.505 13:21:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:13:08.505 13:21:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:08.505 13:21:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:08.766 BaseBdev1_malloc 00:13:08.766 13:21:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:09.028 true 00:13:09.028 13:21:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:09.028 [2024-07-25 13:21:49.819385] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:09.028 [2024-07-25 13:21:49.819417] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:13:09.028 [2024-07-25 13:21:49.819429] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15e02a0 00:13:09.028 [2024-07-25 13:21:49.819435] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:09.289 [2024-07-25 13:21:49.820732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:09.289 [2024-07-25 13:21:49.820753] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:09.289 BaseBdev1 00:13:09.289 13:21:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:13:09.289 13:21:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:09.289 BaseBdev2_malloc 00:13:09.289 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:09.549 true 00:13:09.549 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:09.808 [2024-07-25 13:21:50.410940] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:09.808 [2024-07-25 13:21:50.410974] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:09.808 [2024-07-25 13:21:50.410987] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x169f420 00:13:09.808 [2024-07-25 13:21:50.410993] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:09.808 [2024-07-25 13:21:50.412191] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:09.808 [2024-07-25 13:21:50.412211] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:09.808 BaseBdev2 00:13:09.808 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:13:09.808 [2024-07-25 13:21:50.599434] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:10.066 [2024-07-25 13:21:50.600457] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:10.066 [2024-07-25 13:21:50.600594] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x169f6c0 00:13:10.066 [2024-07-25 13:21:50.600603] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:13:10.066 [2024-07-25 13:21:50.600754] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16a47c0 00:13:10.066 [2024-07-25 13:21:50.600870] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x169f6c0 00:13:10.066 [2024-07-25 13:21:50.600875] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x169f6c0 00:13:10.066 [2024-07-25 13:21:50.600962] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:10.066 "name": "raid_bdev1", 00:13:10.066 "uuid": "c06e1530-97f7-4fdc-993c-973894dab809", 00:13:10.066 "strip_size_kb": 0, 00:13:10.066 "state": "online", 00:13:10.066 "raid_level": "raid1", 00:13:10.066 "superblock": true, 00:13:10.066 "num_base_bdevs": 2, 00:13:10.066 "num_base_bdevs_discovered": 2, 00:13:10.066 "num_base_bdevs_operational": 2, 00:13:10.066 "base_bdevs_list": [ 00:13:10.066 { 00:13:10.066 "name": "BaseBdev1", 00:13:10.066 "uuid": "71db6206-66cc-5327-9659-f4a2b69be230", 00:13:10.066 "is_configured": true, 00:13:10.066 "data_offset": 2048, 00:13:10.066 "data_size": 63488 00:13:10.066 }, 00:13:10.066 { 00:13:10.066 "name": "BaseBdev2", 00:13:10.066 "uuid": "441a5430-d11c-536a-9a73-89df6a637fe1", 00:13:10.066 "is_configured": true, 00:13:10.066 "data_offset": 2048, 00:13:10.066 "data_size": 63488 00:13:10.066 } 00:13:10.066 ] 00:13:10.066 }' 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:10.066 13:21:50 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:10.634 13:21:51 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:13:10.634 13:21:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:10.894 [2024-07-25 13:21:51.489949] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x15df090 00:13:11.832 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=2 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:12.401 13:21:52 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:12.401 13:21:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:12.401 13:21:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:12.401 "name": "raid_bdev1", 00:13:12.401 "uuid": "c06e1530-97f7-4fdc-993c-973894dab809", 00:13:12.401 "strip_size_kb": 0, 00:13:12.401 "state": "online", 00:13:12.401 "raid_level": "raid1", 00:13:12.401 "superblock": true, 00:13:12.401 "num_base_bdevs": 2, 00:13:12.401 "num_base_bdevs_discovered": 2, 00:13:12.401 "num_base_bdevs_operational": 2, 00:13:12.401 "base_bdevs_list": [ 00:13:12.401 { 00:13:12.401 "name": "BaseBdev1", 00:13:12.401 "uuid": "71db6206-66cc-5327-9659-f4a2b69be230", 00:13:12.401 "is_configured": true, 00:13:12.401 "data_offset": 2048, 00:13:12.401 "data_size": 63488 00:13:12.401 }, 00:13:12.401 { 00:13:12.401 "name": "BaseBdev2", 00:13:12.401 "uuid": "441a5430-d11c-536a-9a73-89df6a637fe1", 00:13:12.401 "is_configured": true, 00:13:12.401 "data_offset": 2048, 00:13:12.401 "data_size": 63488 00:13:12.401 } 00:13:12.401 ] 00:13:12.401 }' 00:13:12.401 13:21:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:12.401 13:21:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:12.971 13:21:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:13.231 [2024-07-25 13:21:53.913815] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 
00:13:13.231 [2024-07-25 13:21:53.913843] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:13.231 [2024-07-25 13:21:53.916416] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:13.231 [2024-07-25 13:21:53.916437] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:13.231 [2024-07-25 13:21:53.916494] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:13.231 [2024-07-25 13:21:53.916501] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x169f6c0 name raid_bdev1, state offline 00:13:13.231 0 00:13:13.231 13:21:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 892818 00:13:13.231 13:21:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 892818 ']' 00:13:13.231 13:21:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 892818 00:13:13.231 13:21:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:13:13.231 13:21:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:13.231 13:21:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 892818 00:13:13.231 13:21:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:13.231 13:21:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:13:13.231 13:21:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 892818' 00:13:13.231 killing process with pid 892818 00:13:13.231 13:21:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 892818 00:13:13.231 [2024-07-25 13:21:53.982619] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:13.231 13:21:53 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 892818
00:13:13.231 [2024-07-25 13:21:53.988229] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:13:13.490 13:21:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.MkElEHljIw
00:13:13.490 13:21:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1
00:13:13.490 13:21:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}'
00:13:13.491 13:21:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00
00:13:13.491 13:21:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1
00:13:13.491 13:21:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:13:13.491 13:21:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0
00:13:13.491 13:21:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]]
00:13:13.491
00:13:13.491 real 0m5.765s
00:13:13.491 user 0m9.208s
00:13:13.491 sys 0m0.804s
00:13:13.491 13:21:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable
00:13:13.491 13:21:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:13.491 ************************************
00:13:13.491 END TEST raid_read_error_test
00:13:13.491 ************************************
00:13:13.491 13:21:54 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write
00:13:13.491 13:21:54 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']'
00:13:13.491 13:21:54 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable
00:13:13.491 13:21:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:13:13.491 ************************************
00:13:13.491 START TEST raid_write_error_test
00:13:13.491 ************************************
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 2 write
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=2
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 ))
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs ))
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ ))
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs ))
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ ))
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs ))
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']'
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.OTfn5AmxEu
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=893838
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 893838 /var/tmp/spdk-raid.sock
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 893838 ']'
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:13:13.491 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable
00:13:13.491 13:21:54 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:13.491 [2024-07-25 13:21:54.263176] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:13:13.491 [2024-07-25 13:21:54.263222] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid893838 ]
00:13:13.750 [2024-07-25 13:21:54.350886] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:13:13.750 [2024-07-25 13:21:54.413731] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:13:13.750 [2024-07-25 13:21:54.457750] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:13:13.750 [2024-07-25 13:21:54.457773] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:13:14.318 13:21:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:13:14.318 13:21:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0
00:13:14.318 13:21:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}"
00:13:14.318 13:21:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc
00:13:14.578 BaseBdev1_malloc
00:13:14.578 13:21:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc
00:13:14.838 true
00:13:14.838 13:21:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1
00:13:15.098 [2024-07-25 13:21:55.660543] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc
00:13:15.098 [2024-07-25 13:21:55.660578] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:13:15.098 [2024-07-25 13:21:55.660588] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x246b2a0
00:13:15.098 [2024-07-25 13:21:55.660594] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:13:15.098 [2024-07-25 13:21:55.661902] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:13:15.098 [2024-07-25 13:21:55.661922] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:13:15.098 BaseBdev1
00:13:15.098 13:21:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}"
00:13:15.098 13:21:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc
00:13:15.098 BaseBdev2_malloc
00:13:15.098 13:21:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc
00:13:15.358 true
00:13:15.358 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2
00:13:15.618 [2024-07-25 13:21:56.227917] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc
00:13:15.618 [2024-07-25 13:21:56.227947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:13:15.618 [2024-07-25 13:21:56.227958] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x252a420
00:13:15.618 [2024-07-25 13:21:56.227965] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:13:15.618 [2024-07-25 13:21:56.229172] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:13:15.618 [2024-07-25 13:21:56.229191] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2
00:13:15.618 BaseBdev2
00:13:15.618 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s
00:13:15.878 [2024-07-25 13:21:56.416411] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:13:15.878 [2024-07-25 13:21:56.417437] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:13:15.878 [2024-07-25 13:21:56.417574] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x252a6c0
00:13:15.878 [2024-07-25 13:21:56.417583] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512
00:13:15.878 [2024-07-25 13:21:56.417732] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x252f7c0
00:13:15.878 [2024-07-25 13:21:56.417845] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x252a6c0
00:13:15.878 [2024-07-25 13:21:56.417851] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x252a6c0
00:13:15.878 [2024-07-25 13:21:56.417939] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:15.878 "name": "raid_bdev1",
00:13:15.878 "uuid": "bdfa42af-1991-489f-a261-dc9f2be8438b",
00:13:15.878 "strip_size_kb": 0,
00:13:15.878 "state": "online",
00:13:15.878 "raid_level": "raid1",
00:13:15.878 "superblock": true,
00:13:15.878 "num_base_bdevs": 2,
00:13:15.878 "num_base_bdevs_discovered": 2,
00:13:15.878 "num_base_bdevs_operational": 2,
00:13:15.878 "base_bdevs_list": [
00:13:15.878 {
00:13:15.878 "name": "BaseBdev1",
00:13:15.878 "uuid": "af15ee03-e8c9-5b73-a954-d8e43251617f",
00:13:15.878 "is_configured": true,
00:13:15.878 "data_offset": 2048,
00:13:15.878 "data_size": 63488
00:13:15.878 },
00:13:15.878 {
00:13:15.878 "name": "BaseBdev2",
00:13:15.878 "uuid": "b2e9b347-4987-57fb-97ca-c24d552229f0",
00:13:15.878 "is_configured": true,
00:13:15.878 "data_offset": 2048,
00:13:15.878 "data_size": 63488
00:13:15.878 }
00:13:15.878 ]
00:13:15.878 }'
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:15.878 13:21:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:16.448 13:21:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1
00:13:16.448 13:21:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests
00:13:16.708 [2024-07-25 13:21:57.278839] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x246a090
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure
00:13:17.648 [2024-07-25 13:21:58.367666] bdev_raid.c:2263:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1'
00:13:17.648 [2024-07-25 13:21:58.367713] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:13:17.648 [2024-07-25 13:21:58.367873] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x246a090
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]]
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]]
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=1
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:17.648 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:17.649 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:17.649 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:13:17.909 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:17.909 "name": "raid_bdev1",
00:13:17.909 "uuid": "bdfa42af-1991-489f-a261-dc9f2be8438b",
00:13:17.909 "strip_size_kb": 0,
00:13:17.909 "state": "online",
00:13:17.909 "raid_level": "raid1",
00:13:17.909 "superblock": true,
00:13:17.909 "num_base_bdevs": 2,
00:13:17.909 "num_base_bdevs_discovered": 1,
00:13:17.909 "num_base_bdevs_operational": 1,
00:13:17.909 "base_bdevs_list": [
00:13:17.909 {
00:13:17.909 "name": null,
00:13:17.909 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:17.909 "is_configured": false,
00:13:17.909 "data_offset": 2048,
00:13:17.909 "data_size": 63488
00:13:17.909 },
00:13:17.909 {
00:13:17.909 "name": "BaseBdev2",
00:13:17.909 "uuid": "b2e9b347-4987-57fb-97ca-c24d552229f0",
00:13:17.909 "is_configured": true,
00:13:17.909 "data_offset": 2048,
00:13:17.909 "data_size": 63488
00:13:17.909 }
00:13:17.909 ]
00:13:17.909 }'
00:13:17.909 13:21:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:17.909 13:21:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:18.478 13:21:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:13:18.738 [2024-07-25 13:21:59.329680] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:13:18.738 [2024-07-25 13:21:59.329708] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:13:18.738 [2024-07-25 13:21:59.332303] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:13:18.738 [2024-07-25 13:21:59.332321] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:13:18.738 [2024-07-25 13:21:59.332359] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:13:18.738 [2024-07-25 13:21:59.332365] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x252a6c0 name raid_bdev1, state offline
00:13:18.738 0
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 893838
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 893838 ']'
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 893838
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 893838
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 893838'
00:13:18.738 killing process with pid 893838
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 893838
00:13:18.738 [2024-07-25 13:21:59.419193] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:13:18.738 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 893838
00:13:18.738 [2024-07-25 13:21:59.424446] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:13:18.999 13:21:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.OTfn5AmxEu
00:13:18.999 13:21:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1
00:13:18.999 13:21:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}'
00:13:18.999 13:21:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00
00:13:18.999 13:21:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1
00:13:18.999 13:21:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:13:18.999 13:21:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0
00:13:18.999 13:21:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]]
00:13:18.999
00:13:18.999 real 0m5.366s
00:13:18.999 user 0m8.436s
00:13:18.999 sys 0m0.759s
00:13:18.999 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable
00:13:18.999 13:21:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x
00:13:18.999 ************************************
00:13:18.999 END TEST raid_write_error_test
00:13:18.999 ************************************
00:13:18.999 13:21:59 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4}
00:13:18.999 13:21:59 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1
00:13:18.999 13:21:59 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false
00:13:18.999 13:21:59 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']'
00:13:18.999 13:21:59 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable
00:13:18.999 13:21:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:13:18.999 ************************************
00:13:18.999 START TEST raid_state_function_test
00:13:18.999 ************************************
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 3 false
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 ))
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3')
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']'
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']'
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg=
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=894850
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 894850'
00:13:18.999 Process raid pid: 894850
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 894850 /var/tmp/spdk-raid.sock
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 894850 ']'
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100
00:13:18.999 13:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:13:19.000 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:13:19.000 13:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable
00:13:19.000 13:21:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:13:19.000 [2024-07-25 13:21:59.698421] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:13:19.000 [2024-07-25 13:21:59.698472] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:13:19.259 [2024-07-25 13:21:59.792349] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:13:19.259 [2024-07-25 13:21:59.869109] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:13:19.259 [2024-07-25 13:21:59.913443] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:13:19.259 [2024-07-25 13:21:59.913472] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:13:19.826 13:22:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:13:19.826 13:22:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0
00:13:19.826 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:13:20.085 [2024-07-25 13:22:00.733575] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:13:20.085 [2024-07-25 13:22:00.733609] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:13:20.085 [2024-07-25 13:22:00.733616] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:13:20.085 [2024-07-25 13:22:00.733622] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:13:20.085 [2024-07-25 13:22:00.733626] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:13:20.085 [2024-07-25 13:22:00.733632] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:20.085 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:13:20.351 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:20.351 "name": "Existed_Raid",
00:13:20.351 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:20.351 "strip_size_kb": 64,
00:13:20.351 "state": "configuring",
00:13:20.351 "raid_level": "raid0",
00:13:20.351 "superblock": false,
00:13:20.351 "num_base_bdevs": 3,
00:13:20.351 "num_base_bdevs_discovered": 0,
00:13:20.351 "num_base_bdevs_operational": 3,
00:13:20.351 "base_bdevs_list": [
00:13:20.351 {
00:13:20.351 "name": "BaseBdev1",
00:13:20.351 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:20.351 "is_configured": false,
00:13:20.351 "data_offset": 0,
00:13:20.351 "data_size": 0
00:13:20.351 },
00:13:20.351 {
00:13:20.351 "name": "BaseBdev2",
00:13:20.351 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:20.351 "is_configured": false,
00:13:20.351 "data_offset": 0,
00:13:20.351 "data_size": 0
00:13:20.351 },
00:13:20.351 {
00:13:20.351 "name": "BaseBdev3",
00:13:20.351 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:20.351 "is_configured": false,
00:13:20.351 "data_offset": 0,
00:13:20.351 "data_size": 0
00:13:20.351 }
00:13:20.351 ]
00:13:20.351 }'
00:13:20.351 13:22:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:20.351 13:22:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:13:20.960 13:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:13:20.960 [2024-07-25 13:22:01.687869] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:13:20.960 [2024-07-25 13:22:01.687893] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0b6d0 name Existed_Raid, state configuring
00:13:20.961 13:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:13:21.220 [2024-07-25 13:22:01.900425] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:13:21.220 [2024-07-25 13:22:01.900446] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:13:21.220 [2024-07-25 13:22:01.900451] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:13:21.220 [2024-07-25 13:22:01.900456] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:13:21.220 [2024-07-25 13:22:01.900462] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:13:21.220 [2024-07-25 13:22:01.900467] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:13:21.220 13:22:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:13:21.479 [2024-07-25 13:22:02.115418] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:13:21.479 BaseBdev1
00:13:21.480 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:13:21.480 13:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1
00:13:21.480 13:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:13:21.480 13:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:13:21.480 13:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:13:21.480 13:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:13:21.480 13:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:13:21.739 13:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:13:21.739 [
00:13:21.739 {
00:13:21.739 "name": "BaseBdev1",
00:13:21.739 "aliases": [
00:13:21.739 "e6ebeca6-b14f-49a6-8eeb-1085c5f48314"
00:13:21.739 ],
00:13:21.739 "product_name": "Malloc disk",
00:13:21.739 "block_size": 512,
00:13:21.739 "num_blocks": 65536,
00:13:21.739 "uuid": "e6ebeca6-b14f-49a6-8eeb-1085c5f48314",
00:13:21.739 "assigned_rate_limits": {
00:13:21.739 "rw_ios_per_sec": 0,
00:13:21.739 "rw_mbytes_per_sec": 0,
00:13:21.739 "r_mbytes_per_sec": 0,
00:13:21.739 "w_mbytes_per_sec": 0
00:13:21.739 },
00:13:21.739 "claimed": true,
00:13:21.739 "claim_type": "exclusive_write",
00:13:21.739 "zoned": false,
00:13:21.739 "supported_io_types": {
00:13:21.739 "read": true,
00:13:21.739 "write": true,
00:13:21.739 "unmap": true,
00:13:21.739 "flush": true,
00:13:21.739 "reset": true,
00:13:21.739 "nvme_admin": false,
00:13:21.739 "nvme_io": false,
00:13:21.739 "nvme_io_md": false,
00:13:21.739 "write_zeroes": true,
00:13:21.739 "zcopy": true,
00:13:21.739 "get_zone_info": false,
00:13:21.739 "zone_management": false,
00:13:21.739 "zone_append": false,
00:13:21.739 "compare": false,
00:13:21.739 "compare_and_write": false,
00:13:21.739 "abort": true,
00:13:21.739 "seek_hole": false,
00:13:21.739 "seek_data": false,
00:13:21.739 "copy": true,
00:13:21.739 "nvme_iov_md": false
00:13:21.739 },
00:13:21.739 "memory_domains": [
00:13:21.739 {
00:13:21.739 "dma_device_id": "system",
00:13:21.739 "dma_device_type": 1
00:13:21.739 },
00:13:21.739 {
00:13:21.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:13:21.739 "dma_device_type": 2
00:13:21.739 }
00:13:21.739 ],
00:13:21.739 "driver_specific": {}
00:13:21.739 }
00:13:21.739 ]
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:13:21.999 "name": "Existed_Raid",
00:13:21.999 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:21.999 "strip_size_kb": 64,
00:13:21.999 "state": "configuring",
00:13:21.999 "raid_level": "raid0",
00:13:21.999 "superblock": false,
00:13:21.999 "num_base_bdevs": 3,
00:13:21.999 "num_base_bdevs_discovered": 1,
00:13:21.999 "num_base_bdevs_operational": 3,
00:13:21.999 "base_bdevs_list": [
00:13:21.999 {
00:13:21.999 "name": "BaseBdev1",
00:13:21.999 "uuid": "e6ebeca6-b14f-49a6-8eeb-1085c5f48314",
00:13:21.999 "is_configured": true,
00:13:21.999 "data_offset": 0,
00:13:21.999 "data_size": 65536
00:13:21.999 },
00:13:21.999 {
00:13:21.999 "name": "BaseBdev2",
00:13:21.999 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:21.999 "is_configured": false,
00:13:21.999 "data_offset": 0,
00:13:21.999 "data_size": 0
00:13:21.999 },
00:13:21.999 {
00:13:21.999 "name": "BaseBdev3",
00:13:21.999 "uuid": "00000000-0000-0000-0000-000000000000",
00:13:21.999 "is_configured": false,
00:13:21.999 "data_offset": 0,
00:13:21.999 "data_size": 0
00:13:21.999 }
00:13:21.999 ]
00:13:21.999 }'
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:13:21.999 13:22:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:13:22.569 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:13:22.828 [2024-07-25 13:22:03.470838] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:13:22.828 [2024-07-25 13:22:03.470867] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0afa0 name Existed_Raid, state configuring
00:13:22.828 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid
00:13:23.088 [2024-07-25 13:22:03.667364] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:13:23.088 [2024-07-25 13:22:03.668484] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:13:23.088 [2024-07-25 13:22:03.668511] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:13:23.088 [2024-07-25 13:22:03.668517] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3
00:13:23.088 [2024-07-25 13:22:03.668523] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now
00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.088 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:23.347 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:23.347 "name": "Existed_Raid", 00:13:23.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.347 "strip_size_kb": 64, 00:13:23.347 "state": "configuring", 00:13:23.347 
"raid_level": "raid0", 00:13:23.347 "superblock": false, 00:13:23.347 "num_base_bdevs": 3, 00:13:23.347 "num_base_bdevs_discovered": 1, 00:13:23.347 "num_base_bdevs_operational": 3, 00:13:23.347 "base_bdevs_list": [ 00:13:23.347 { 00:13:23.347 "name": "BaseBdev1", 00:13:23.347 "uuid": "e6ebeca6-b14f-49a6-8eeb-1085c5f48314", 00:13:23.347 "is_configured": true, 00:13:23.347 "data_offset": 0, 00:13:23.347 "data_size": 65536 00:13:23.347 }, 00:13:23.347 { 00:13:23.347 "name": "BaseBdev2", 00:13:23.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.347 "is_configured": false, 00:13:23.347 "data_offset": 0, 00:13:23.347 "data_size": 0 00:13:23.347 }, 00:13:23.347 { 00:13:23.347 "name": "BaseBdev3", 00:13:23.347 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.347 "is_configured": false, 00:13:23.347 "data_offset": 0, 00:13:23.348 "data_size": 0 00:13:23.348 } 00:13:23.348 ] 00:13:23.348 }' 00:13:23.348 13:22:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:23.348 13:22:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:23.915 13:22:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:23.915 [2024-07-25 13:22:04.650594] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:23.915 BaseBdev2 00:13:23.915 13:22:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:23.915 13:22:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:23.915 13:22:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:23.915 13:22:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:23.915 13:22:04 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:23.915 13:22:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:23.915 13:22:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:24.175 13:22:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:24.434 [ 00:13:24.434 { 00:13:24.434 "name": "BaseBdev2", 00:13:24.434 "aliases": [ 00:13:24.434 "72ad0b68-3862-4fb2-ab40-64866ed2fb9d" 00:13:24.434 ], 00:13:24.434 "product_name": "Malloc disk", 00:13:24.434 "block_size": 512, 00:13:24.434 "num_blocks": 65536, 00:13:24.434 "uuid": "72ad0b68-3862-4fb2-ab40-64866ed2fb9d", 00:13:24.434 "assigned_rate_limits": { 00:13:24.434 "rw_ios_per_sec": 0, 00:13:24.434 "rw_mbytes_per_sec": 0, 00:13:24.434 "r_mbytes_per_sec": 0, 00:13:24.434 "w_mbytes_per_sec": 0 00:13:24.434 }, 00:13:24.434 "claimed": true, 00:13:24.434 "claim_type": "exclusive_write", 00:13:24.434 "zoned": false, 00:13:24.434 "supported_io_types": { 00:13:24.434 "read": true, 00:13:24.434 "write": true, 00:13:24.434 "unmap": true, 00:13:24.434 "flush": true, 00:13:24.434 "reset": true, 00:13:24.434 "nvme_admin": false, 00:13:24.434 "nvme_io": false, 00:13:24.434 "nvme_io_md": false, 00:13:24.434 "write_zeroes": true, 00:13:24.434 "zcopy": true, 00:13:24.434 "get_zone_info": false, 00:13:24.434 "zone_management": false, 00:13:24.434 "zone_append": false, 00:13:24.434 "compare": false, 00:13:24.434 "compare_and_write": false, 00:13:24.434 "abort": true, 00:13:24.434 "seek_hole": false, 00:13:24.434 "seek_data": false, 00:13:24.434 "copy": true, 00:13:24.434 "nvme_iov_md": false 00:13:24.434 }, 00:13:24.434 "memory_domains": [ 00:13:24.434 { 00:13:24.434 "dma_device_id": "system", 
00:13:24.434 "dma_device_type": 1 00:13:24.434 }, 00:13:24.434 { 00:13:24.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:24.434 "dma_device_type": 2 00:13:24.434 } 00:13:24.434 ], 00:13:24.434 "driver_specific": {} 00:13:24.434 } 00:13:24.434 ] 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:24.434 13:22:05 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:24.693 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:24.693 "name": "Existed_Raid", 00:13:24.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.693 "strip_size_kb": 64, 00:13:24.693 "state": "configuring", 00:13:24.693 "raid_level": "raid0", 00:13:24.694 "superblock": false, 00:13:24.694 "num_base_bdevs": 3, 00:13:24.694 "num_base_bdevs_discovered": 2, 00:13:24.694 "num_base_bdevs_operational": 3, 00:13:24.694 "base_bdevs_list": [ 00:13:24.694 { 00:13:24.694 "name": "BaseBdev1", 00:13:24.694 "uuid": "e6ebeca6-b14f-49a6-8eeb-1085c5f48314", 00:13:24.694 "is_configured": true, 00:13:24.694 "data_offset": 0, 00:13:24.694 "data_size": 65536 00:13:24.694 }, 00:13:24.694 { 00:13:24.694 "name": "BaseBdev2", 00:13:24.694 "uuid": "72ad0b68-3862-4fb2-ab40-64866ed2fb9d", 00:13:24.694 "is_configured": true, 00:13:24.694 "data_offset": 0, 00:13:24.694 "data_size": 65536 00:13:24.694 }, 00:13:24.694 { 00:13:24.694 "name": "BaseBdev3", 00:13:24.694 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:24.694 "is_configured": false, 00:13:24.694 "data_offset": 0, 00:13:24.694 "data_size": 0 00:13:24.694 } 00:13:24.694 ] 00:13:24.694 }' 00:13:24.694 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:24.694 13:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.264 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:25.264 [2024-07-25 13:22:05.983085] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:25.264 [2024-07-25 13:22:05.983117] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f0bea0 00:13:25.264 [2024-07-25 13:22:05.983121] 
bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:25.264 [2024-07-25 13:22:05.983297] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f0bb70 00:13:25.264 [2024-07-25 13:22:05.983390] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f0bea0 00:13:25.264 [2024-07-25 13:22:05.983396] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f0bea0 00:13:25.264 [2024-07-25 13:22:05.983513] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:25.264 BaseBdev3 00:13:25.264 13:22:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:25.264 13:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:13:25.264 13:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:25.264 13:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:25.264 13:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:25.264 13:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:25.264 13:22:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:25.523 13:22:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:25.783 [ 00:13:25.783 { 00:13:25.783 "name": "BaseBdev3", 00:13:25.783 "aliases": [ 00:13:25.783 "f93cc328-5ab6-4308-abf0-e2d731619c26" 00:13:25.783 ], 00:13:25.783 "product_name": "Malloc disk", 00:13:25.783 "block_size": 512, 00:13:25.784 "num_blocks": 65536, 00:13:25.784 
"uuid": "f93cc328-5ab6-4308-abf0-e2d731619c26", 00:13:25.784 "assigned_rate_limits": { 00:13:25.784 "rw_ios_per_sec": 0, 00:13:25.784 "rw_mbytes_per_sec": 0, 00:13:25.784 "r_mbytes_per_sec": 0, 00:13:25.784 "w_mbytes_per_sec": 0 00:13:25.784 }, 00:13:25.784 "claimed": true, 00:13:25.784 "claim_type": "exclusive_write", 00:13:25.784 "zoned": false, 00:13:25.784 "supported_io_types": { 00:13:25.784 "read": true, 00:13:25.784 "write": true, 00:13:25.784 "unmap": true, 00:13:25.784 "flush": true, 00:13:25.784 "reset": true, 00:13:25.784 "nvme_admin": false, 00:13:25.784 "nvme_io": false, 00:13:25.784 "nvme_io_md": false, 00:13:25.784 "write_zeroes": true, 00:13:25.784 "zcopy": true, 00:13:25.784 "get_zone_info": false, 00:13:25.784 "zone_management": false, 00:13:25.784 "zone_append": false, 00:13:25.784 "compare": false, 00:13:25.784 "compare_and_write": false, 00:13:25.784 "abort": true, 00:13:25.784 "seek_hole": false, 00:13:25.784 "seek_data": false, 00:13:25.784 "copy": true, 00:13:25.784 "nvme_iov_md": false 00:13:25.784 }, 00:13:25.784 "memory_domains": [ 00:13:25.784 { 00:13:25.784 "dma_device_id": "system", 00:13:25.784 "dma_device_type": 1 00:13:25.784 }, 00:13:25.784 { 00:13:25.784 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.784 "dma_device_type": 2 00:13:25.784 } 00:13:25.784 ], 00:13:25.784 "driver_specific": {} 00:13:25.784 } 00:13:25.784 ] 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:25.784 13:22:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.784 "name": "Existed_Raid", 00:13:25.784 "uuid": "2eef54c4-7bab-4f42-a0cf-4b0a456fcd2c", 00:13:25.784 "strip_size_kb": 64, 00:13:25.784 "state": "online", 00:13:25.784 "raid_level": "raid0", 00:13:25.784 "superblock": false, 00:13:25.784 "num_base_bdevs": 3, 00:13:25.784 "num_base_bdevs_discovered": 3, 00:13:25.784 "num_base_bdevs_operational": 3, 00:13:25.784 "base_bdevs_list": [ 00:13:25.784 { 00:13:25.784 "name": "BaseBdev1", 00:13:25.784 "uuid": "e6ebeca6-b14f-49a6-8eeb-1085c5f48314", 00:13:25.784 "is_configured": true, 00:13:25.784 "data_offset": 0, 00:13:25.784 "data_size": 65536 00:13:25.784 }, 00:13:25.784 { 00:13:25.784 "name": "BaseBdev2", 00:13:25.784 "uuid": 
"72ad0b68-3862-4fb2-ab40-64866ed2fb9d", 00:13:25.784 "is_configured": true, 00:13:25.784 "data_offset": 0, 00:13:25.784 "data_size": 65536 00:13:25.784 }, 00:13:25.784 { 00:13:25.784 "name": "BaseBdev3", 00:13:25.784 "uuid": "f93cc328-5ab6-4308-abf0-e2d731619c26", 00:13:25.784 "is_configured": true, 00:13:25.784 "data_offset": 0, 00:13:25.784 "data_size": 65536 00:13:25.784 } 00:13:25.784 ] 00:13:25.784 }' 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.784 13:22:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:26.353 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:26.353 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:26.353 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:26.353 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:26.353 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:26.353 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:26.353 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:26.353 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:26.613 [2024-07-25 13:22:07.242515] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:26.613 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:26.613 "name": "Existed_Raid", 00:13:26.613 "aliases": [ 00:13:26.613 "2eef54c4-7bab-4f42-a0cf-4b0a456fcd2c" 00:13:26.613 ], 00:13:26.613 "product_name": "Raid Volume", 
00:13:26.613 "block_size": 512, 00:13:26.613 "num_blocks": 196608, 00:13:26.613 "uuid": "2eef54c4-7bab-4f42-a0cf-4b0a456fcd2c", 00:13:26.613 "assigned_rate_limits": { 00:13:26.613 "rw_ios_per_sec": 0, 00:13:26.613 "rw_mbytes_per_sec": 0, 00:13:26.613 "r_mbytes_per_sec": 0, 00:13:26.613 "w_mbytes_per_sec": 0 00:13:26.613 }, 00:13:26.613 "claimed": false, 00:13:26.613 "zoned": false, 00:13:26.613 "supported_io_types": { 00:13:26.613 "read": true, 00:13:26.613 "write": true, 00:13:26.613 "unmap": true, 00:13:26.613 "flush": true, 00:13:26.613 "reset": true, 00:13:26.613 "nvme_admin": false, 00:13:26.613 "nvme_io": false, 00:13:26.613 "nvme_io_md": false, 00:13:26.613 "write_zeroes": true, 00:13:26.613 "zcopy": false, 00:13:26.613 "get_zone_info": false, 00:13:26.613 "zone_management": false, 00:13:26.613 "zone_append": false, 00:13:26.613 "compare": false, 00:13:26.613 "compare_and_write": false, 00:13:26.613 "abort": false, 00:13:26.613 "seek_hole": false, 00:13:26.613 "seek_data": false, 00:13:26.613 "copy": false, 00:13:26.613 "nvme_iov_md": false 00:13:26.613 }, 00:13:26.613 "memory_domains": [ 00:13:26.613 { 00:13:26.613 "dma_device_id": "system", 00:13:26.613 "dma_device_type": 1 00:13:26.613 }, 00:13:26.613 { 00:13:26.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.613 "dma_device_type": 2 00:13:26.613 }, 00:13:26.613 { 00:13:26.613 "dma_device_id": "system", 00:13:26.613 "dma_device_type": 1 00:13:26.613 }, 00:13:26.613 { 00:13:26.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.613 "dma_device_type": 2 00:13:26.613 }, 00:13:26.613 { 00:13:26.613 "dma_device_id": "system", 00:13:26.613 "dma_device_type": 1 00:13:26.613 }, 00:13:26.613 { 00:13:26.613 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.613 "dma_device_type": 2 00:13:26.613 } 00:13:26.613 ], 00:13:26.613 "driver_specific": { 00:13:26.613 "raid": { 00:13:26.613 "uuid": "2eef54c4-7bab-4f42-a0cf-4b0a456fcd2c", 00:13:26.613 "strip_size_kb": 64, 00:13:26.613 "state": "online", 00:13:26.613 
"raid_level": "raid0", 00:13:26.613 "superblock": false, 00:13:26.613 "num_base_bdevs": 3, 00:13:26.613 "num_base_bdevs_discovered": 3, 00:13:26.613 "num_base_bdevs_operational": 3, 00:13:26.613 "base_bdevs_list": [ 00:13:26.613 { 00:13:26.613 "name": "BaseBdev1", 00:13:26.613 "uuid": "e6ebeca6-b14f-49a6-8eeb-1085c5f48314", 00:13:26.613 "is_configured": true, 00:13:26.613 "data_offset": 0, 00:13:26.613 "data_size": 65536 00:13:26.613 }, 00:13:26.613 { 00:13:26.613 "name": "BaseBdev2", 00:13:26.613 "uuid": "72ad0b68-3862-4fb2-ab40-64866ed2fb9d", 00:13:26.613 "is_configured": true, 00:13:26.613 "data_offset": 0, 00:13:26.613 "data_size": 65536 00:13:26.613 }, 00:13:26.613 { 00:13:26.613 "name": "BaseBdev3", 00:13:26.613 "uuid": "f93cc328-5ab6-4308-abf0-e2d731619c26", 00:13:26.613 "is_configured": true, 00:13:26.613 "data_offset": 0, 00:13:26.613 "data_size": 65536 00:13:26.613 } 00:13:26.613 ] 00:13:26.613 } 00:13:26.613 } 00:13:26.613 }' 00:13:26.613 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:26.613 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:26.613 BaseBdev2 00:13:26.613 BaseBdev3' 00:13:26.613 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:26.613 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:26.613 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:26.874 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:26.874 "name": "BaseBdev1", 00:13:26.874 "aliases": [ 00:13:26.874 "e6ebeca6-b14f-49a6-8eeb-1085c5f48314" 00:13:26.874 ], 00:13:26.874 "product_name": "Malloc disk", 00:13:26.874 
"block_size": 512, 00:13:26.874 "num_blocks": 65536, 00:13:26.874 "uuid": "e6ebeca6-b14f-49a6-8eeb-1085c5f48314", 00:13:26.874 "assigned_rate_limits": { 00:13:26.874 "rw_ios_per_sec": 0, 00:13:26.874 "rw_mbytes_per_sec": 0, 00:13:26.874 "r_mbytes_per_sec": 0, 00:13:26.874 "w_mbytes_per_sec": 0 00:13:26.874 }, 00:13:26.874 "claimed": true, 00:13:26.874 "claim_type": "exclusive_write", 00:13:26.874 "zoned": false, 00:13:26.874 "supported_io_types": { 00:13:26.874 "read": true, 00:13:26.874 "write": true, 00:13:26.874 "unmap": true, 00:13:26.874 "flush": true, 00:13:26.874 "reset": true, 00:13:26.874 "nvme_admin": false, 00:13:26.874 "nvme_io": false, 00:13:26.874 "nvme_io_md": false, 00:13:26.874 "write_zeroes": true, 00:13:26.874 "zcopy": true, 00:13:26.874 "get_zone_info": false, 00:13:26.874 "zone_management": false, 00:13:26.874 "zone_append": false, 00:13:26.874 "compare": false, 00:13:26.874 "compare_and_write": false, 00:13:26.874 "abort": true, 00:13:26.874 "seek_hole": false, 00:13:26.874 "seek_data": false, 00:13:26.874 "copy": true, 00:13:26.874 "nvme_iov_md": false 00:13:26.874 }, 00:13:26.874 "memory_domains": [ 00:13:26.874 { 00:13:26.874 "dma_device_id": "system", 00:13:26.874 "dma_device_type": 1 00:13:26.874 }, 00:13:26.874 { 00:13:26.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:26.874 "dma_device_type": 2 00:13:26.874 } 00:13:26.874 ], 00:13:26.874 "driver_specific": {} 00:13:26.874 }' 00:13:26.874 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:26.874 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:26.874 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:26.874 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:26.874 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.135 13:22:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:27.135 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.135 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.135 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:27.135 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.135 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.135 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:27.135 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:27.135 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:27.135 13:22:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:27.396 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:27.396 "name": "BaseBdev2", 00:13:27.396 "aliases": [ 00:13:27.396 "72ad0b68-3862-4fb2-ab40-64866ed2fb9d" 00:13:27.396 ], 00:13:27.396 "product_name": "Malloc disk", 00:13:27.396 "block_size": 512, 00:13:27.396 "num_blocks": 65536, 00:13:27.396 "uuid": "72ad0b68-3862-4fb2-ab40-64866ed2fb9d", 00:13:27.396 "assigned_rate_limits": { 00:13:27.396 "rw_ios_per_sec": 0, 00:13:27.396 "rw_mbytes_per_sec": 0, 00:13:27.396 "r_mbytes_per_sec": 0, 00:13:27.396 "w_mbytes_per_sec": 0 00:13:27.396 }, 00:13:27.396 "claimed": true, 00:13:27.396 "claim_type": "exclusive_write", 00:13:27.396 "zoned": false, 00:13:27.396 "supported_io_types": { 00:13:27.396 "read": true, 00:13:27.396 "write": true, 00:13:27.396 "unmap": true, 00:13:27.396 "flush": true, 00:13:27.396 "reset": true, 00:13:27.396 "nvme_admin": 
false, 00:13:27.396 "nvme_io": false, 00:13:27.396 "nvme_io_md": false, 00:13:27.396 "write_zeroes": true, 00:13:27.396 "zcopy": true, 00:13:27.396 "get_zone_info": false, 00:13:27.396 "zone_management": false, 00:13:27.396 "zone_append": false, 00:13:27.396 "compare": false, 00:13:27.396 "compare_and_write": false, 00:13:27.396 "abort": true, 00:13:27.396 "seek_hole": false, 00:13:27.396 "seek_data": false, 00:13:27.396 "copy": true, 00:13:27.396 "nvme_iov_md": false 00:13:27.396 }, 00:13:27.396 "memory_domains": [ 00:13:27.396 { 00:13:27.396 "dma_device_id": "system", 00:13:27.396 "dma_device_type": 1 00:13:27.396 }, 00:13:27.396 { 00:13:27.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.396 "dma_device_type": 2 00:13:27.396 } 00:13:27.396 ], 00:13:27.396 "driver_specific": {} 00:13:27.396 }' 00:13:27.396 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.396 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.396 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:27.396 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.396 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.656 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:27.656 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.656 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:27.656 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:27.656 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.656 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:27.656 13:22:08 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:27.656 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:27.656 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:27.656 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:27.916 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:27.916 "name": "BaseBdev3", 00:13:27.916 "aliases": [ 00:13:27.916 "f93cc328-5ab6-4308-abf0-e2d731619c26" 00:13:27.916 ], 00:13:27.916 "product_name": "Malloc disk", 00:13:27.916 "block_size": 512, 00:13:27.916 "num_blocks": 65536, 00:13:27.916 "uuid": "f93cc328-5ab6-4308-abf0-e2d731619c26", 00:13:27.916 "assigned_rate_limits": { 00:13:27.916 "rw_ios_per_sec": 0, 00:13:27.916 "rw_mbytes_per_sec": 0, 00:13:27.916 "r_mbytes_per_sec": 0, 00:13:27.916 "w_mbytes_per_sec": 0 00:13:27.916 }, 00:13:27.916 "claimed": true, 00:13:27.916 "claim_type": "exclusive_write", 00:13:27.916 "zoned": false, 00:13:27.916 "supported_io_types": { 00:13:27.916 "read": true, 00:13:27.916 "write": true, 00:13:27.917 "unmap": true, 00:13:27.917 "flush": true, 00:13:27.917 "reset": true, 00:13:27.917 "nvme_admin": false, 00:13:27.917 "nvme_io": false, 00:13:27.917 "nvme_io_md": false, 00:13:27.917 "write_zeroes": true, 00:13:27.917 "zcopy": true, 00:13:27.917 "get_zone_info": false, 00:13:27.917 "zone_management": false, 00:13:27.917 "zone_append": false, 00:13:27.917 "compare": false, 00:13:27.917 "compare_and_write": false, 00:13:27.917 "abort": true, 00:13:27.917 "seek_hole": false, 00:13:27.917 "seek_data": false, 00:13:27.917 "copy": true, 00:13:27.917 "nvme_iov_md": false 00:13:27.917 }, 00:13:27.917 "memory_domains": [ 00:13:27.917 { 00:13:27.917 "dma_device_id": "system", 00:13:27.917 "dma_device_type": 1 00:13:27.917 
}, 00:13:27.917 { 00:13:27.917 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:27.917 "dma_device_type": 2 00:13:27.917 } 00:13:27.917 ], 00:13:27.917 "driver_specific": {} 00:13:27.917 }' 00:13:27.917 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.917 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:27.917 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:27.917 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:27.917 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:28.176 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:28.176 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:28.176 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:28.176 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:28.176 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:28.176 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:28.176 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:28.176 13:22:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:28.436 [2024-07-25 13:22:09.054899] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:28.436 [2024-07-25 13:22:09.054917] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:28.436 [2024-07-25 13:22:09.054949] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:28.436 
13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.436 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:13:28.696 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.696 "name": "Existed_Raid", 00:13:28.696 "uuid": "2eef54c4-7bab-4f42-a0cf-4b0a456fcd2c", 00:13:28.696 "strip_size_kb": 64, 00:13:28.696 "state": "offline", 00:13:28.696 "raid_level": "raid0", 00:13:28.696 "superblock": false, 00:13:28.696 "num_base_bdevs": 3, 00:13:28.696 "num_base_bdevs_discovered": 2, 00:13:28.696 "num_base_bdevs_operational": 2, 00:13:28.696 "base_bdevs_list": [ 00:13:28.696 { 00:13:28.696 "name": null, 00:13:28.696 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.696 "is_configured": false, 00:13:28.696 "data_offset": 0, 00:13:28.696 "data_size": 65536 00:13:28.696 }, 00:13:28.696 { 00:13:28.696 "name": "BaseBdev2", 00:13:28.696 "uuid": "72ad0b68-3862-4fb2-ab40-64866ed2fb9d", 00:13:28.696 "is_configured": true, 00:13:28.696 "data_offset": 0, 00:13:28.696 "data_size": 65536 00:13:28.696 }, 00:13:28.696 { 00:13:28.696 "name": "BaseBdev3", 00:13:28.696 "uuid": "f93cc328-5ab6-4308-abf0-e2d731619c26", 00:13:28.696 "is_configured": true, 00:13:28.696 "data_offset": 0, 00:13:28.696 "data_size": 65536 00:13:28.696 } 00:13:28.696 ] 00:13:28.696 }' 00:13:28.696 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.696 13:22:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:29.266 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:29.266 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:29.266 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.266 13:22:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:29.266 13:22:10 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:29.266 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:29.266 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:29.525 [2024-07-25 13:22:10.197797] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:29.525 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:29.525 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:29.525 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.525 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:29.784 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:29.784 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:29.784 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:30.043 [2024-07-25 13:22:10.588612] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:30.043 [2024-07-25 13:22:10.588643] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0bea0 name Existed_Raid, state offline 00:13:30.043 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:30.043 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:30.043 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:30.043 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:30.043 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:30.043 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:30.043 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:30.043 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:30.043 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:30.043 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:30.302 BaseBdev2 00:13:30.303 13:22:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:30.303 13:22:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:30.303 13:22:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:30.303 13:22:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:30.303 13:22:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:30.303 13:22:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:30.303 13:22:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:30.562 13:22:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:30.562 [ 00:13:30.562 { 00:13:30.562 "name": "BaseBdev2", 00:13:30.562 "aliases": [ 00:13:30.562 "7da9f437-97e7-4de7-9f5d-757cd777ca0b" 00:13:30.562 ], 00:13:30.562 "product_name": "Malloc disk", 00:13:30.562 "block_size": 512, 00:13:30.562 "num_blocks": 65536, 00:13:30.562 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:30.562 "assigned_rate_limits": { 00:13:30.562 "rw_ios_per_sec": 0, 00:13:30.562 "rw_mbytes_per_sec": 0, 00:13:30.562 "r_mbytes_per_sec": 0, 00:13:30.562 "w_mbytes_per_sec": 0 00:13:30.562 }, 00:13:30.562 "claimed": false, 00:13:30.562 "zoned": false, 00:13:30.562 "supported_io_types": { 00:13:30.562 "read": true, 00:13:30.562 "write": true, 00:13:30.562 "unmap": true, 00:13:30.562 "flush": true, 00:13:30.562 "reset": true, 00:13:30.562 "nvme_admin": false, 00:13:30.562 "nvme_io": false, 00:13:30.562 "nvme_io_md": false, 00:13:30.562 "write_zeroes": true, 00:13:30.562 "zcopy": true, 00:13:30.562 "get_zone_info": false, 00:13:30.562 "zone_management": false, 00:13:30.562 "zone_append": false, 00:13:30.562 "compare": false, 00:13:30.562 "compare_and_write": false, 00:13:30.562 "abort": true, 00:13:30.562 "seek_hole": false, 00:13:30.562 "seek_data": false, 00:13:30.562 "copy": true, 00:13:30.562 "nvme_iov_md": false 00:13:30.562 }, 00:13:30.562 "memory_domains": [ 00:13:30.562 { 00:13:30.562 "dma_device_id": "system", 00:13:30.562 "dma_device_type": 1 00:13:30.562 }, 00:13:30.562 { 00:13:30.562 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.562 "dma_device_type": 2 00:13:30.562 } 00:13:30.562 ], 00:13:30.562 "driver_specific": {} 00:13:30.562 } 00:13:30.562 ] 00:13:30.562 13:22:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:30.562 13:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:30.562 13:22:11 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:30.562 13:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:30.821 BaseBdev3 00:13:30.821 13:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:30.821 13:22:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:13:30.821 13:22:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:30.821 13:22:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:30.821 13:22:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:30.821 13:22:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:30.821 13:22:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:31.080 13:22:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:31.339 [ 00:13:31.339 { 00:13:31.339 "name": "BaseBdev3", 00:13:31.339 "aliases": [ 00:13:31.339 "76967af1-5511-4ae4-8373-2d1361fcf83b" 00:13:31.339 ], 00:13:31.339 "product_name": "Malloc disk", 00:13:31.339 "block_size": 512, 00:13:31.339 "num_blocks": 65536, 00:13:31.339 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:31.339 "assigned_rate_limits": { 00:13:31.339 "rw_ios_per_sec": 0, 00:13:31.339 "rw_mbytes_per_sec": 0, 00:13:31.339 "r_mbytes_per_sec": 0, 00:13:31.339 "w_mbytes_per_sec": 0 00:13:31.339 }, 00:13:31.339 "claimed": false, 00:13:31.339 "zoned": false, 00:13:31.339 
"supported_io_types": { 00:13:31.339 "read": true, 00:13:31.339 "write": true, 00:13:31.339 "unmap": true, 00:13:31.339 "flush": true, 00:13:31.339 "reset": true, 00:13:31.339 "nvme_admin": false, 00:13:31.340 "nvme_io": false, 00:13:31.340 "nvme_io_md": false, 00:13:31.340 "write_zeroes": true, 00:13:31.340 "zcopy": true, 00:13:31.340 "get_zone_info": false, 00:13:31.340 "zone_management": false, 00:13:31.340 "zone_append": false, 00:13:31.340 "compare": false, 00:13:31.340 "compare_and_write": false, 00:13:31.340 "abort": true, 00:13:31.340 "seek_hole": false, 00:13:31.340 "seek_data": false, 00:13:31.340 "copy": true, 00:13:31.340 "nvme_iov_md": false 00:13:31.340 }, 00:13:31.340 "memory_domains": [ 00:13:31.340 { 00:13:31.340 "dma_device_id": "system", 00:13:31.340 "dma_device_type": 1 00:13:31.340 }, 00:13:31.340 { 00:13:31.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.340 "dma_device_type": 2 00:13:31.340 } 00:13:31.340 ], 00:13:31.340 "driver_specific": {} 00:13:31.340 } 00:13:31.340 ] 00:13:31.340 13:22:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:31.340 13:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:31.340 13:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:31.340 13:22:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:31.340 [2024-07-25 13:22:12.076276] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:31.340 [2024-07-25 13:22:12.076311] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:31.340 [2024-07-25 13:22:12.076323] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:31.340 
[2024-07-25 13:22:12.077345] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:31.340 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:31.599 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:31.599 "name": "Existed_Raid", 00:13:31.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:31.599 "strip_size_kb": 64, 00:13:31.599 "state": "configuring", 00:13:31.599 "raid_level": "raid0", 00:13:31.599 "superblock": false, 00:13:31.599 "num_base_bdevs": 3, 00:13:31.599 
"num_base_bdevs_discovered": 2, 00:13:31.599 "num_base_bdevs_operational": 3, 00:13:31.599 "base_bdevs_list": [ 00:13:31.599 { 00:13:31.599 "name": "BaseBdev1", 00:13:31.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:31.599 "is_configured": false, 00:13:31.599 "data_offset": 0, 00:13:31.599 "data_size": 0 00:13:31.599 }, 00:13:31.599 { 00:13:31.599 "name": "BaseBdev2", 00:13:31.600 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:31.600 "is_configured": true, 00:13:31.600 "data_offset": 0, 00:13:31.600 "data_size": 65536 00:13:31.600 }, 00:13:31.600 { 00:13:31.600 "name": "BaseBdev3", 00:13:31.600 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:31.600 "is_configured": true, 00:13:31.600 "data_offset": 0, 00:13:31.600 "data_size": 65536 00:13:31.600 } 00:13:31.600 ] 00:13:31.600 }' 00:13:31.600 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:31.600 13:22:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.168 13:22:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:32.427 [2024-07-25 13:22:12.990576] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:32.428 "name": "Existed_Raid", 00:13:32.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.428 "strip_size_kb": 64, 00:13:32.428 "state": "configuring", 00:13:32.428 "raid_level": "raid0", 00:13:32.428 "superblock": false, 00:13:32.428 "num_base_bdevs": 3, 00:13:32.428 "num_base_bdevs_discovered": 1, 00:13:32.428 "num_base_bdevs_operational": 3, 00:13:32.428 "base_bdevs_list": [ 00:13:32.428 { 00:13:32.428 "name": "BaseBdev1", 00:13:32.428 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:32.428 "is_configured": false, 00:13:32.428 "data_offset": 0, 00:13:32.428 "data_size": 0 00:13:32.428 }, 00:13:32.428 { 00:13:32.428 "name": null, 00:13:32.428 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:32.428 "is_configured": false, 00:13:32.428 "data_offset": 0, 00:13:32.428 "data_size": 65536 00:13:32.428 }, 00:13:32.428 { 00:13:32.428 "name": "BaseBdev3", 00:13:32.428 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:32.428 "is_configured": true, 00:13:32.428 "data_offset": 0, 00:13:32.428 "data_size": 65536 00:13:32.428 } 
00:13:32.428 ] 00:13:32.428 }' 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:32.428 13:22:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:32.996 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.996 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:33.255 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:33.255 13:22:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:33.514 [2024-07-25 13:22:14.114465] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:33.514 BaseBdev1 00:13:33.514 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:33.514 13:22:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:33.514 13:22:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:33.514 13:22:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:13:33.514 13:22:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:33.514 13:22:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:33.514 13:22:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:33.773 [ 00:13:33.773 { 00:13:33.773 "name": "BaseBdev1", 00:13:33.773 "aliases": [ 00:13:33.773 "483dfbcd-420e-4dcb-af2a-698fe3465b0f" 00:13:33.773 ], 00:13:33.773 "product_name": "Malloc disk", 00:13:33.773 "block_size": 512, 00:13:33.773 "num_blocks": 65536, 00:13:33.773 "uuid": "483dfbcd-420e-4dcb-af2a-698fe3465b0f", 00:13:33.773 "assigned_rate_limits": { 00:13:33.773 "rw_ios_per_sec": 0, 00:13:33.773 "rw_mbytes_per_sec": 0, 00:13:33.773 "r_mbytes_per_sec": 0, 00:13:33.773 "w_mbytes_per_sec": 0 00:13:33.773 }, 00:13:33.773 "claimed": true, 00:13:33.773 "claim_type": "exclusive_write", 00:13:33.773 "zoned": false, 00:13:33.773 "supported_io_types": { 00:13:33.773 "read": true, 00:13:33.773 "write": true, 00:13:33.773 "unmap": true, 00:13:33.773 "flush": true, 00:13:33.773 "reset": true, 00:13:33.773 "nvme_admin": false, 00:13:33.773 "nvme_io": false, 00:13:33.773 "nvme_io_md": false, 00:13:33.773 "write_zeroes": true, 00:13:33.773 "zcopy": true, 00:13:33.773 "get_zone_info": false, 00:13:33.773 "zone_management": false, 00:13:33.773 "zone_append": false, 00:13:33.773 "compare": false, 00:13:33.773 "compare_and_write": false, 00:13:33.773 "abort": true, 00:13:33.773 "seek_hole": false, 00:13:33.773 "seek_data": false, 00:13:33.773 "copy": true, 00:13:33.773 "nvme_iov_md": false 00:13:33.773 }, 00:13:33.773 "memory_domains": [ 00:13:33.773 { 00:13:33.773 "dma_device_id": "system", 00:13:33.773 "dma_device_type": 1 00:13:33.773 }, 00:13:33.773 { 00:13:33.773 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:33.773 "dma_device_type": 2 00:13:33.773 } 00:13:33.773 ], 00:13:33.773 "driver_specific": {} 00:13:33.773 } 00:13:33.773 ] 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:33.773 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:34.032 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:34.032 "name": "Existed_Raid", 00:13:34.032 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:34.032 "strip_size_kb": 64, 00:13:34.032 "state": "configuring", 00:13:34.032 "raid_level": "raid0", 00:13:34.032 "superblock": false, 00:13:34.032 "num_base_bdevs": 3, 00:13:34.032 "num_base_bdevs_discovered": 2, 00:13:34.032 "num_base_bdevs_operational": 3, 00:13:34.032 "base_bdevs_list": [ 00:13:34.032 { 00:13:34.032 "name": "BaseBdev1", 00:13:34.032 
"uuid": "483dfbcd-420e-4dcb-af2a-698fe3465b0f", 00:13:34.032 "is_configured": true, 00:13:34.032 "data_offset": 0, 00:13:34.032 "data_size": 65536 00:13:34.032 }, 00:13:34.032 { 00:13:34.032 "name": null, 00:13:34.032 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:34.032 "is_configured": false, 00:13:34.032 "data_offset": 0, 00:13:34.032 "data_size": 65536 00:13:34.032 }, 00:13:34.032 { 00:13:34.032 "name": "BaseBdev3", 00:13:34.032 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:34.032 "is_configured": true, 00:13:34.032 "data_offset": 0, 00:13:34.032 "data_size": 65536 00:13:34.032 } 00:13:34.032 ] 00:13:34.032 }' 00:13:34.032 13:22:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:34.032 13:22:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:34.600 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.600 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:34.860 [2024-07-25 13:22:15.566158] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 
00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.860 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:35.119 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:35.119 "name": "Existed_Raid", 00:13:35.119 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:35.119 "strip_size_kb": 64, 00:13:35.119 "state": "configuring", 00:13:35.119 "raid_level": "raid0", 00:13:35.119 "superblock": false, 00:13:35.119 "num_base_bdevs": 3, 00:13:35.119 "num_base_bdevs_discovered": 1, 00:13:35.119 "num_base_bdevs_operational": 3, 00:13:35.119 "base_bdevs_list": [ 00:13:35.119 { 00:13:35.119 "name": "BaseBdev1", 00:13:35.119 "uuid": "483dfbcd-420e-4dcb-af2a-698fe3465b0f", 00:13:35.119 "is_configured": true, 00:13:35.119 "data_offset": 0, 00:13:35.119 "data_size": 65536 00:13:35.119 }, 00:13:35.119 { 00:13:35.119 "name": null, 00:13:35.119 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:35.119 "is_configured": false, 00:13:35.119 
"data_offset": 0, 00:13:35.119 "data_size": 65536 00:13:35.119 }, 00:13:35.119 { 00:13:35.119 "name": null, 00:13:35.119 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:35.119 "is_configured": false, 00:13:35.119 "data_offset": 0, 00:13:35.119 "data_size": 65536 00:13:35.119 } 00:13:35.119 ] 00:13:35.119 }' 00:13:35.119 13:22:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:35.119 13:22:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:35.688 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.688 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:35.946 [2024-07-25 13:22:16.713077] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:35.946 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:36.204 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:36.204 "name": "Existed_Raid", 00:13:36.204 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:36.204 "strip_size_kb": 64, 00:13:36.204 "state": "configuring", 00:13:36.204 "raid_level": "raid0", 00:13:36.204 "superblock": false, 00:13:36.204 "num_base_bdevs": 3, 00:13:36.204 "num_base_bdevs_discovered": 2, 00:13:36.204 "num_base_bdevs_operational": 3, 00:13:36.204 "base_bdevs_list": [ 00:13:36.204 { 00:13:36.204 "name": "BaseBdev1", 00:13:36.204 "uuid": "483dfbcd-420e-4dcb-af2a-698fe3465b0f", 00:13:36.204 "is_configured": true, 00:13:36.204 "data_offset": 0, 00:13:36.204 "data_size": 65536 00:13:36.204 }, 00:13:36.204 { 00:13:36.204 "name": null, 00:13:36.204 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:36.204 "is_configured": false, 00:13:36.204 "data_offset": 0, 00:13:36.204 "data_size": 65536 00:13:36.204 }, 00:13:36.204 { 00:13:36.204 "name": "BaseBdev3", 00:13:36.204 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:36.204 "is_configured": true, 00:13:36.204 "data_offset": 0, 00:13:36.204 "data_size": 65536 00:13:36.204 } 00:13:36.204 ] 
00:13:36.204 }' 00:13:36.204 13:22:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:36.204 13:22:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:36.773 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.773 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:37.034 [2024-07-25 13:22:17.803837] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- 
# local num_base_bdevs_discovered 00:13:37.034 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:37.295 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.295 13:22:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:37.295 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:37.295 "name": "Existed_Raid", 00:13:37.295 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:37.295 "strip_size_kb": 64, 00:13:37.295 "state": "configuring", 00:13:37.295 "raid_level": "raid0", 00:13:37.295 "superblock": false, 00:13:37.295 "num_base_bdevs": 3, 00:13:37.295 "num_base_bdevs_discovered": 1, 00:13:37.295 "num_base_bdevs_operational": 3, 00:13:37.295 "base_bdevs_list": [ 00:13:37.295 { 00:13:37.295 "name": null, 00:13:37.295 "uuid": "483dfbcd-420e-4dcb-af2a-698fe3465b0f", 00:13:37.295 "is_configured": false, 00:13:37.295 "data_offset": 0, 00:13:37.295 "data_size": 65536 00:13:37.295 }, 00:13:37.295 { 00:13:37.295 "name": null, 00:13:37.295 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:37.295 "is_configured": false, 00:13:37.295 "data_offset": 0, 00:13:37.295 "data_size": 65536 00:13:37.295 }, 00:13:37.295 { 00:13:37.295 "name": "BaseBdev3", 00:13:37.295 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:37.295 "is_configured": true, 00:13:37.295 "data_offset": 0, 00:13:37.295 "data_size": 65536 00:13:37.295 } 00:13:37.295 ] 00:13:37.295 }' 00:13:37.295 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.295 13:22:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.865 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:37.865 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:38.125 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:13:38.125 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:38.387 [2024-07-25 13:22:18.924591] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.387 13:22:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:38.387 13:22:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.387 "name": "Existed_Raid", 00:13:38.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.387 "strip_size_kb": 64, 00:13:38.387 "state": "configuring", 00:13:38.387 "raid_level": "raid0", 00:13:38.387 "superblock": false, 00:13:38.387 "num_base_bdevs": 3, 00:13:38.387 "num_base_bdevs_discovered": 2, 00:13:38.387 "num_base_bdevs_operational": 3, 00:13:38.387 "base_bdevs_list": [ 00:13:38.387 { 00:13:38.387 "name": null, 00:13:38.387 "uuid": "483dfbcd-420e-4dcb-af2a-698fe3465b0f", 00:13:38.387 "is_configured": false, 00:13:38.387 "data_offset": 0, 00:13:38.387 "data_size": 65536 00:13:38.387 }, 00:13:38.387 { 00:13:38.387 "name": "BaseBdev2", 00:13:38.387 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:38.387 "is_configured": true, 00:13:38.387 "data_offset": 0, 00:13:38.387 "data_size": 65536 00:13:38.387 }, 00:13:38.387 { 00:13:38.387 "name": "BaseBdev3", 00:13:38.387 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:38.387 "is_configured": true, 00:13:38.387 "data_offset": 0, 00:13:38.387 "data_size": 65536 00:13:38.387 } 00:13:38.387 ] 00:13:38.387 }' 00:13:38.387 13:22:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:38.387 13:22:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:39.327 13:22:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.327 13:22:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:39.587 
13:22:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:39.587 13:22:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.587 13:22:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:40.158 13:22:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 483dfbcd-420e-4dcb-af2a-698fe3465b0f 00:13:40.418 [2024-07-25 13:22:20.958814] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:40.418 [2024-07-25 13:22:20.958843] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f0a120 00:13:40.418 [2024-07-25 13:22:20.958847] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:40.418 [2024-07-25 13:22:20.958993] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f16290 00:13:40.418 [2024-07-25 13:22:20.959083] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f0a120 00:13:40.418 [2024-07-25 13:22:20.959088] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1f0a120 00:13:40.418 [2024-07-25 13:22:20.959208] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:40.418 NewBaseBdev 00:13:40.418 13:22:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:40.418 13:22:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:13:40.418 13:22:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:40.418 13:22:20 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@901 -- # local i 00:13:40.418 13:22:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:40.418 13:22:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:40.418 13:22:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:40.418 13:22:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:40.678 [ 00:13:40.678 { 00:13:40.678 "name": "NewBaseBdev", 00:13:40.678 "aliases": [ 00:13:40.679 "483dfbcd-420e-4dcb-af2a-698fe3465b0f" 00:13:40.679 ], 00:13:40.679 "product_name": "Malloc disk", 00:13:40.679 "block_size": 512, 00:13:40.679 "num_blocks": 65536, 00:13:40.679 "uuid": "483dfbcd-420e-4dcb-af2a-698fe3465b0f", 00:13:40.679 "assigned_rate_limits": { 00:13:40.679 "rw_ios_per_sec": 0, 00:13:40.679 "rw_mbytes_per_sec": 0, 00:13:40.679 "r_mbytes_per_sec": 0, 00:13:40.679 "w_mbytes_per_sec": 0 00:13:40.679 }, 00:13:40.679 "claimed": true, 00:13:40.679 "claim_type": "exclusive_write", 00:13:40.679 "zoned": false, 00:13:40.679 "supported_io_types": { 00:13:40.679 "read": true, 00:13:40.679 "write": true, 00:13:40.679 "unmap": true, 00:13:40.679 "flush": true, 00:13:40.679 "reset": true, 00:13:40.679 "nvme_admin": false, 00:13:40.679 "nvme_io": false, 00:13:40.679 "nvme_io_md": false, 00:13:40.679 "write_zeroes": true, 00:13:40.679 "zcopy": true, 00:13:40.679 "get_zone_info": false, 00:13:40.679 "zone_management": false, 00:13:40.679 "zone_append": false, 00:13:40.679 "compare": false, 00:13:40.679 "compare_and_write": false, 00:13:40.679 "abort": true, 00:13:40.679 "seek_hole": false, 00:13:40.679 "seek_data": false, 00:13:40.679 "copy": true, 00:13:40.679 "nvme_iov_md": 
false 00:13:40.679 }, 00:13:40.679 "memory_domains": [ 00:13:40.679 { 00:13:40.679 "dma_device_id": "system", 00:13:40.679 "dma_device_type": 1 00:13:40.679 }, 00:13:40.679 { 00:13:40.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:40.679 "dma_device_type": 2 00:13:40.679 } 00:13:40.679 ], 00:13:40.679 "driver_specific": {} 00:13:40.679 } 00:13:40.679 ] 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:40.679 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.940 13:22:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:40.940 "name": "Existed_Raid", 00:13:40.940 "uuid": "4646aa56-d8f7-4e8f-a942-b2c8a5941ad9", 00:13:40.940 "strip_size_kb": 64, 00:13:40.940 "state": "online", 00:13:40.940 "raid_level": "raid0", 00:13:40.940 "superblock": false, 00:13:40.940 "num_base_bdevs": 3, 00:13:40.940 "num_base_bdevs_discovered": 3, 00:13:40.940 "num_base_bdevs_operational": 3, 00:13:40.940 "base_bdevs_list": [ 00:13:40.940 { 00:13:40.940 "name": "NewBaseBdev", 00:13:40.940 "uuid": "483dfbcd-420e-4dcb-af2a-698fe3465b0f", 00:13:40.940 "is_configured": true, 00:13:40.940 "data_offset": 0, 00:13:40.940 "data_size": 65536 00:13:40.940 }, 00:13:40.940 { 00:13:40.940 "name": "BaseBdev2", 00:13:40.940 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:40.940 "is_configured": true, 00:13:40.940 "data_offset": 0, 00:13:40.940 "data_size": 65536 00:13:40.940 }, 00:13:40.940 { 00:13:40.940 "name": "BaseBdev3", 00:13:40.940 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:40.940 "is_configured": true, 00:13:40.940 "data_offset": 0, 00:13:40.940 "data_size": 65536 00:13:40.940 } 00:13:40.940 ] 00:13:40.940 }' 00:13:40.940 13:22:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:40.940 13:22:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.511 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:41.511 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:41.511 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:41.511 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:41.511 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:41.511 13:22:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:41.511 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:41.511 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:41.511 [2024-07-25 13:22:22.266377] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:41.511 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:41.511 "name": "Existed_Raid", 00:13:41.511 "aliases": [ 00:13:41.511 "4646aa56-d8f7-4e8f-a942-b2c8a5941ad9" 00:13:41.511 ], 00:13:41.511 "product_name": "Raid Volume", 00:13:41.511 "block_size": 512, 00:13:41.511 "num_blocks": 196608, 00:13:41.511 "uuid": "4646aa56-d8f7-4e8f-a942-b2c8a5941ad9", 00:13:41.511 "assigned_rate_limits": { 00:13:41.511 "rw_ios_per_sec": 0, 00:13:41.511 "rw_mbytes_per_sec": 0, 00:13:41.511 "r_mbytes_per_sec": 0, 00:13:41.511 "w_mbytes_per_sec": 0 00:13:41.511 }, 00:13:41.511 "claimed": false, 00:13:41.511 "zoned": false, 00:13:41.511 "supported_io_types": { 00:13:41.511 "read": true, 00:13:41.511 "write": true, 00:13:41.511 "unmap": true, 00:13:41.511 "flush": true, 00:13:41.511 "reset": true, 00:13:41.511 "nvme_admin": false, 00:13:41.511 "nvme_io": false, 00:13:41.511 "nvme_io_md": false, 00:13:41.511 "write_zeroes": true, 00:13:41.511 "zcopy": false, 00:13:41.511 "get_zone_info": false, 00:13:41.511 "zone_management": false, 00:13:41.511 "zone_append": false, 00:13:41.511 "compare": false, 00:13:41.511 "compare_and_write": false, 00:13:41.511 "abort": false, 00:13:41.511 "seek_hole": false, 00:13:41.511 "seek_data": false, 00:13:41.511 "copy": false, 00:13:41.511 "nvme_iov_md": false 00:13:41.511 }, 00:13:41.511 "memory_domains": [ 00:13:41.511 { 00:13:41.511 "dma_device_id": "system", 00:13:41.511 "dma_device_type": 1 00:13:41.511 }, 
00:13:41.511 { 00:13:41.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.511 "dma_device_type": 2 00:13:41.511 }, 00:13:41.511 { 00:13:41.511 "dma_device_id": "system", 00:13:41.511 "dma_device_type": 1 00:13:41.511 }, 00:13:41.511 { 00:13:41.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.511 "dma_device_type": 2 00:13:41.511 }, 00:13:41.511 { 00:13:41.511 "dma_device_id": "system", 00:13:41.511 "dma_device_type": 1 00:13:41.511 }, 00:13:41.511 { 00:13:41.511 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:41.511 "dma_device_type": 2 00:13:41.511 } 00:13:41.511 ], 00:13:41.511 "driver_specific": { 00:13:41.511 "raid": { 00:13:41.511 "uuid": "4646aa56-d8f7-4e8f-a942-b2c8a5941ad9", 00:13:41.511 "strip_size_kb": 64, 00:13:41.511 "state": "online", 00:13:41.511 "raid_level": "raid0", 00:13:41.511 "superblock": false, 00:13:41.511 "num_base_bdevs": 3, 00:13:41.511 "num_base_bdevs_discovered": 3, 00:13:41.511 "num_base_bdevs_operational": 3, 00:13:41.511 "base_bdevs_list": [ 00:13:41.511 { 00:13:41.511 "name": "NewBaseBdev", 00:13:41.511 "uuid": "483dfbcd-420e-4dcb-af2a-698fe3465b0f", 00:13:41.511 "is_configured": true, 00:13:41.511 "data_offset": 0, 00:13:41.511 "data_size": 65536 00:13:41.511 }, 00:13:41.511 { 00:13:41.511 "name": "BaseBdev2", 00:13:41.511 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:41.511 "is_configured": true, 00:13:41.511 "data_offset": 0, 00:13:41.511 "data_size": 65536 00:13:41.511 }, 00:13:41.511 { 00:13:41.511 "name": "BaseBdev3", 00:13:41.511 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:41.511 "is_configured": true, 00:13:41.511 "data_offset": 0, 00:13:41.511 "data_size": 65536 00:13:41.511 } 00:13:41.511 ] 00:13:41.511 } 00:13:41.511 } 00:13:41.511 }' 00:13:41.511 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:41.771 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:13:41.771 BaseBdev2 00:13:41.771 BaseBdev3' 00:13:41.771 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:41.771 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:41.771 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:42.341 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:42.341 "name": "NewBaseBdev", 00:13:42.341 "aliases": [ 00:13:42.341 "483dfbcd-420e-4dcb-af2a-698fe3465b0f" 00:13:42.341 ], 00:13:42.341 "product_name": "Malloc disk", 00:13:42.341 "block_size": 512, 00:13:42.341 "num_blocks": 65536, 00:13:42.341 "uuid": "483dfbcd-420e-4dcb-af2a-698fe3465b0f", 00:13:42.341 "assigned_rate_limits": { 00:13:42.341 "rw_ios_per_sec": 0, 00:13:42.341 "rw_mbytes_per_sec": 0, 00:13:42.341 "r_mbytes_per_sec": 0, 00:13:42.341 "w_mbytes_per_sec": 0 00:13:42.341 }, 00:13:42.341 "claimed": true, 00:13:42.341 "claim_type": "exclusive_write", 00:13:42.341 "zoned": false, 00:13:42.341 "supported_io_types": { 00:13:42.341 "read": true, 00:13:42.341 "write": true, 00:13:42.341 "unmap": true, 00:13:42.341 "flush": true, 00:13:42.341 "reset": true, 00:13:42.341 "nvme_admin": false, 00:13:42.341 "nvme_io": false, 00:13:42.341 "nvme_io_md": false, 00:13:42.341 "write_zeroes": true, 00:13:42.341 "zcopy": true, 00:13:42.341 "get_zone_info": false, 00:13:42.341 "zone_management": false, 00:13:42.341 "zone_append": false, 00:13:42.341 "compare": false, 00:13:42.341 "compare_and_write": false, 00:13:42.341 "abort": true, 00:13:42.341 "seek_hole": false, 00:13:42.341 "seek_data": false, 00:13:42.341 "copy": true, 00:13:42.341 "nvme_iov_md": false 00:13:42.341 }, 00:13:42.341 "memory_domains": [ 00:13:42.341 { 00:13:42.341 "dma_device_id": "system", 00:13:42.341 
"dma_device_type": 1 00:13:42.341 }, 00:13:42.341 { 00:13:42.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.341 "dma_device_type": 2 00:13:42.341 } 00:13:42.341 ], 00:13:42.341 "driver_specific": {} 00:13:42.341 }' 00:13:42.341 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:42.341 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:42.342 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:42.342 13:22:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:42.342 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:42.342 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:42.342 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:42.601 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:42.601 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:42.601 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:42.601 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:42.601 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:42.601 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:42.601 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:42.601 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:42.861 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:42.861 "name": 
"BaseBdev2", 00:13:42.861 "aliases": [ 00:13:42.861 "7da9f437-97e7-4de7-9f5d-757cd777ca0b" 00:13:42.861 ], 00:13:42.861 "product_name": "Malloc disk", 00:13:42.861 "block_size": 512, 00:13:42.861 "num_blocks": 65536, 00:13:42.861 "uuid": "7da9f437-97e7-4de7-9f5d-757cd777ca0b", 00:13:42.861 "assigned_rate_limits": { 00:13:42.861 "rw_ios_per_sec": 0, 00:13:42.861 "rw_mbytes_per_sec": 0, 00:13:42.861 "r_mbytes_per_sec": 0, 00:13:42.861 "w_mbytes_per_sec": 0 00:13:42.861 }, 00:13:42.861 "claimed": true, 00:13:42.861 "claim_type": "exclusive_write", 00:13:42.861 "zoned": false, 00:13:42.861 "supported_io_types": { 00:13:42.861 "read": true, 00:13:42.861 "write": true, 00:13:42.861 "unmap": true, 00:13:42.861 "flush": true, 00:13:42.861 "reset": true, 00:13:42.861 "nvme_admin": false, 00:13:42.861 "nvme_io": false, 00:13:42.861 "nvme_io_md": false, 00:13:42.861 "write_zeroes": true, 00:13:42.861 "zcopy": true, 00:13:42.861 "get_zone_info": false, 00:13:42.861 "zone_management": false, 00:13:42.861 "zone_append": false, 00:13:42.861 "compare": false, 00:13:42.861 "compare_and_write": false, 00:13:42.861 "abort": true, 00:13:42.861 "seek_hole": false, 00:13:42.861 "seek_data": false, 00:13:42.861 "copy": true, 00:13:42.861 "nvme_iov_md": false 00:13:42.861 }, 00:13:42.861 "memory_domains": [ 00:13:42.861 { 00:13:42.861 "dma_device_id": "system", 00:13:42.861 "dma_device_type": 1 00:13:42.861 }, 00:13:42.861 { 00:13:42.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:42.861 "dma_device_type": 2 00:13:42.861 } 00:13:42.861 ], 00:13:42.861 "driver_specific": {} 00:13:42.861 }' 00:13:42.861 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:42.861 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:42.861 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:42.861 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:43.120 13:22:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:43.380 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:43.380 "name": "BaseBdev3", 00:13:43.380 "aliases": [ 00:13:43.380 "76967af1-5511-4ae4-8373-2d1361fcf83b" 00:13:43.380 ], 00:13:43.380 "product_name": "Malloc disk", 00:13:43.380 "block_size": 512, 00:13:43.380 "num_blocks": 65536, 00:13:43.380 "uuid": "76967af1-5511-4ae4-8373-2d1361fcf83b", 00:13:43.380 "assigned_rate_limits": { 00:13:43.380 "rw_ios_per_sec": 0, 00:13:43.380 "rw_mbytes_per_sec": 0, 00:13:43.380 "r_mbytes_per_sec": 0, 00:13:43.380 "w_mbytes_per_sec": 0 00:13:43.380 }, 00:13:43.380 "claimed": true, 00:13:43.380 "claim_type": "exclusive_write", 00:13:43.380 "zoned": false, 00:13:43.380 "supported_io_types": { 
00:13:43.380 "read": true, 00:13:43.380 "write": true, 00:13:43.380 "unmap": true, 00:13:43.380 "flush": true, 00:13:43.380 "reset": true, 00:13:43.380 "nvme_admin": false, 00:13:43.380 "nvme_io": false, 00:13:43.380 "nvme_io_md": false, 00:13:43.380 "write_zeroes": true, 00:13:43.380 "zcopy": true, 00:13:43.380 "get_zone_info": false, 00:13:43.380 "zone_management": false, 00:13:43.380 "zone_append": false, 00:13:43.380 "compare": false, 00:13:43.380 "compare_and_write": false, 00:13:43.380 "abort": true, 00:13:43.380 "seek_hole": false, 00:13:43.380 "seek_data": false, 00:13:43.380 "copy": true, 00:13:43.380 "nvme_iov_md": false 00:13:43.380 }, 00:13:43.380 "memory_domains": [ 00:13:43.380 { 00:13:43.380 "dma_device_id": "system", 00:13:43.380 "dma_device_type": 1 00:13:43.380 }, 00:13:43.380 { 00:13:43.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:43.380 "dma_device_type": 2 00:13:43.380 } 00:13:43.380 ], 00:13:43.380 "driver_specific": {} 00:13:43.380 }' 00:13:43.380 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:43.380 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:43.639 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:43.639 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:43.639 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:43.639 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:43.639 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:43.639 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:43.639 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:43.639 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:13:43.639 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:43.929 [2024-07-25 13:22:24.620089] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:43.929 [2024-07-25 13:22:24.620108] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:43.929 [2024-07-25 13:22:24.620145] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:43.929 [2024-07-25 13:22:24.620181] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:43.929 [2024-07-25 13:22:24.620188] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f0a120 name Existed_Raid, state offline 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 894850 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 894850 ']' 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 894850 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 894850 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = 
sudo ']' 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 894850' 00:13:43.929 killing process with pid 894850 00:13:43.929 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 894850 00:13:43.930 [2024-07-25 13:22:24.686370] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:43.930 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 894850 00:13:44.216 [2024-07-25 13:22:24.701307] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:44.216 00:13:44.216 real 0m25.185s 00:13:44.216 user 0m47.214s 00:13:44.216 sys 0m3.637s 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.216 ************************************ 00:13:44.216 END TEST raid_state_function_test 00:13:44.216 ************************************ 00:13:44.216 13:22:24 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:13:44.216 13:22:24 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:13:44.216 13:22:24 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:13:44.216 13:22:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:44.216 ************************************ 00:13:44.216 START TEST raid_state_function_test_sb 00:13:44.216 ************************************ 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 3 true 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:13:44.216 13:22:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:44.216 13:22:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=899741 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 899741' 00:13:44.216 Process raid pid: 899741 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 899741 /var/tmp/spdk-raid.sock 00:13:44.216 13:22:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:44.217 13:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 899741 ']' 00:13:44.217 13:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:44.217 13:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:13:44.217 13:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:13:44.217 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:44.217 13:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:13:44.217 13:22:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:44.217 [2024-07-25 13:22:24.960538] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:13:44.217 [2024-07-25 13:22:24.960601] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:44.477 [2024-07-25 13:22:25.047297] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:44.477 [2024-07-25 13:22:25.110078] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:13:44.477 [2024-07-25 13:22:25.152578] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:44.477 [2024-07-25 13:22:25.152600] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:45.046 13:22:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:13:45.046 13:22:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:13:45.046 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:45.307 [2024-07-25 13:22:25.967622] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:45.307 [2024-07-25 13:22:25.967650] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:45.307 [2024-07-25 13:22:25.967657] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev2 00:13:45.307 [2024-07-25 13:22:25.967663] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:45.307 [2024-07-25 13:22:25.967667] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:45.307 [2024-07-25 13:22:25.967672] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.307 13:22:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.567 13:22:26 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.567 "name": "Existed_Raid", 00:13:45.567 "uuid": "c3973388-5fd5-4b84-b775-6569284979f9", 00:13:45.567 "strip_size_kb": 64, 00:13:45.567 "state": "configuring", 00:13:45.567 "raid_level": "raid0", 00:13:45.567 "superblock": true, 00:13:45.567 "num_base_bdevs": 3, 00:13:45.567 "num_base_bdevs_discovered": 0, 00:13:45.567 "num_base_bdevs_operational": 3, 00:13:45.567 "base_bdevs_list": [ 00:13:45.567 { 00:13:45.567 "name": "BaseBdev1", 00:13:45.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.567 "is_configured": false, 00:13:45.567 "data_offset": 0, 00:13:45.567 "data_size": 0 00:13:45.567 }, 00:13:45.567 { 00:13:45.567 "name": "BaseBdev2", 00:13:45.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.567 "is_configured": false, 00:13:45.567 "data_offset": 0, 00:13:45.567 "data_size": 0 00:13:45.567 }, 00:13:45.567 { 00:13:45.567 "name": "BaseBdev3", 00:13:45.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.567 "is_configured": false, 00:13:45.567 "data_offset": 0, 00:13:45.567 "data_size": 0 00:13:45.567 } 00:13:45.567 ] 00:13:45.567 }' 00:13:45.567 13:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.567 13:22:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:46.137 13:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:46.137 [2024-07-25 13:22:26.897856] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:46.137 [2024-07-25 13:22:26.897875] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22396d0 name Existed_Raid, state configuring 00:13:46.137 13:22:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:46.396 [2024-07-25 13:22:27.074328] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:46.396 [2024-07-25 13:22:27.074347] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:46.396 [2024-07-25 13:22:27.074352] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:46.396 [2024-07-25 13:22:27.074358] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:46.396 [2024-07-25 13:22:27.074362] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:46.396 [2024-07-25 13:22:27.074368] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:46.396 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:46.656 [2024-07-25 13:22:27.257373] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:46.656 BaseBdev1 00:13:46.656 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:46.656 13:22:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:46.656 13:22:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:46.656 13:22:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:46.656 13:22:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:46.656 13:22:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:13:46.656 13:22:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:46.656 13:22:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:46.915 [ 00:13:46.915 { 00:13:46.915 "name": "BaseBdev1", 00:13:46.915 "aliases": [ 00:13:46.915 "b361cdbe-1504-47f9-ac0d-3395b4a1ddb0" 00:13:46.915 ], 00:13:46.915 "product_name": "Malloc disk", 00:13:46.915 "block_size": 512, 00:13:46.915 "num_blocks": 65536, 00:13:46.916 "uuid": "b361cdbe-1504-47f9-ac0d-3395b4a1ddb0", 00:13:46.916 "assigned_rate_limits": { 00:13:46.916 "rw_ios_per_sec": 0, 00:13:46.916 "rw_mbytes_per_sec": 0, 00:13:46.916 "r_mbytes_per_sec": 0, 00:13:46.916 "w_mbytes_per_sec": 0 00:13:46.916 }, 00:13:46.916 "claimed": true, 00:13:46.916 "claim_type": "exclusive_write", 00:13:46.916 "zoned": false, 00:13:46.916 "supported_io_types": { 00:13:46.916 "read": true, 00:13:46.916 "write": true, 00:13:46.916 "unmap": true, 00:13:46.916 "flush": true, 00:13:46.916 "reset": true, 00:13:46.916 "nvme_admin": false, 00:13:46.916 "nvme_io": false, 00:13:46.916 "nvme_io_md": false, 00:13:46.916 "write_zeroes": true, 00:13:46.916 "zcopy": true, 00:13:46.916 "get_zone_info": false, 00:13:46.916 "zone_management": false, 00:13:46.916 "zone_append": false, 00:13:46.916 "compare": false, 00:13:46.916 "compare_and_write": false, 00:13:46.916 "abort": true, 00:13:46.916 "seek_hole": false, 00:13:46.916 "seek_data": false, 00:13:46.916 "copy": true, 00:13:46.916 "nvme_iov_md": false 00:13:46.916 }, 00:13:46.916 "memory_domains": [ 00:13:46.916 { 00:13:46.916 "dma_device_id": "system", 00:13:46.916 "dma_device_type": 1 00:13:46.916 }, 00:13:46.916 { 00:13:46.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:46.916 
"dma_device_type": 2 00:13:46.916 } 00:13:46.916 ], 00:13:46.916 "driver_specific": {} 00:13:46.916 } 00:13:46.916 ] 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.916 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.176 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.176 "name": "Existed_Raid", 00:13:47.176 "uuid": "40e7b590-ec16-4177-911b-391bec3f9ec7", 00:13:47.176 "strip_size_kb": 64, 
00:13:47.176 "state": "configuring", 00:13:47.176 "raid_level": "raid0", 00:13:47.176 "superblock": true, 00:13:47.176 "num_base_bdevs": 3, 00:13:47.176 "num_base_bdevs_discovered": 1, 00:13:47.176 "num_base_bdevs_operational": 3, 00:13:47.176 "base_bdevs_list": [ 00:13:47.176 { 00:13:47.176 "name": "BaseBdev1", 00:13:47.176 "uuid": "b361cdbe-1504-47f9-ac0d-3395b4a1ddb0", 00:13:47.176 "is_configured": true, 00:13:47.176 "data_offset": 2048, 00:13:47.176 "data_size": 63488 00:13:47.176 }, 00:13:47.176 { 00:13:47.176 "name": "BaseBdev2", 00:13:47.176 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.176 "is_configured": false, 00:13:47.176 "data_offset": 0, 00:13:47.176 "data_size": 0 00:13:47.176 }, 00:13:47.176 { 00:13:47.176 "name": "BaseBdev3", 00:13:47.176 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:47.176 "is_configured": false, 00:13:47.176 "data_offset": 0, 00:13:47.176 "data_size": 0 00:13:47.176 } 00:13:47.176 ] 00:13:47.176 }' 00:13:47.176 13:22:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.176 13:22:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:47.745 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:47.745 [2024-07-25 13:22:28.532588] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:47.745 [2024-07-25 13:22:28.532615] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2238fa0 name Existed_Raid, state configuring 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:48.005 [2024-07-25 13:22:28.729120] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:48.005 [2024-07-25 13:22:28.730230] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:48.005 [2024-07-25 13:22:28.730254] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:48.005 [2024-07-25 13:22:28.730260] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:48.005 [2024-07-25 13:22:28.730266] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- 
# local tmp 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:48.005 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:48.265 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:48.265 "name": "Existed_Raid", 00:13:48.265 "uuid": "734fe83c-d162-4c15-b877-ec724bd5fc1b", 00:13:48.265 "strip_size_kb": 64, 00:13:48.265 "state": "configuring", 00:13:48.265 "raid_level": "raid0", 00:13:48.265 "superblock": true, 00:13:48.265 "num_base_bdevs": 3, 00:13:48.265 "num_base_bdevs_discovered": 1, 00:13:48.265 "num_base_bdevs_operational": 3, 00:13:48.265 "base_bdevs_list": [ 00:13:48.265 { 00:13:48.265 "name": "BaseBdev1", 00:13:48.265 "uuid": "b361cdbe-1504-47f9-ac0d-3395b4a1ddb0", 00:13:48.265 "is_configured": true, 00:13:48.265 "data_offset": 2048, 00:13:48.265 "data_size": 63488 00:13:48.265 }, 00:13:48.265 { 00:13:48.265 "name": "BaseBdev2", 00:13:48.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.265 "is_configured": false, 00:13:48.265 "data_offset": 0, 00:13:48.265 "data_size": 0 00:13:48.265 }, 00:13:48.265 { 00:13:48.265 "name": "BaseBdev3", 00:13:48.265 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:48.265 "is_configured": false, 00:13:48.265 "data_offset": 0, 00:13:48.265 "data_size": 0 00:13:48.265 } 00:13:48.265 ] 00:13:48.265 }' 00:13:48.265 13:22:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:48.265 13:22:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:48.836 13:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:49.096 
[2024-07-25 13:22:29.676483] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:49.096 BaseBdev2 00:13:49.096 13:22:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:13:49.096 13:22:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:49.096 13:22:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:49.096 13:22:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:49.096 13:22:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:49.096 13:22:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:49.096 13:22:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:49.096 13:22:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:49.357 [ 00:13:49.357 { 00:13:49.357 "name": "BaseBdev2", 00:13:49.357 "aliases": [ 00:13:49.357 "3b46ec55-6a58-4422-a0ad-878c5f43291d" 00:13:49.357 ], 00:13:49.357 "product_name": "Malloc disk", 00:13:49.357 "block_size": 512, 00:13:49.357 "num_blocks": 65536, 00:13:49.357 "uuid": "3b46ec55-6a58-4422-a0ad-878c5f43291d", 00:13:49.357 "assigned_rate_limits": { 00:13:49.357 "rw_ios_per_sec": 0, 00:13:49.357 "rw_mbytes_per_sec": 0, 00:13:49.357 "r_mbytes_per_sec": 0, 00:13:49.357 "w_mbytes_per_sec": 0 00:13:49.357 }, 00:13:49.357 "claimed": true, 00:13:49.357 "claim_type": "exclusive_write", 00:13:49.357 "zoned": false, 00:13:49.357 "supported_io_types": { 00:13:49.357 "read": true, 00:13:49.357 "write": true, 00:13:49.357 "unmap": 
true, 00:13:49.357 "flush": true, 00:13:49.357 "reset": true, 00:13:49.357 "nvme_admin": false, 00:13:49.357 "nvme_io": false, 00:13:49.357 "nvme_io_md": false, 00:13:49.357 "write_zeroes": true, 00:13:49.357 "zcopy": true, 00:13:49.357 "get_zone_info": false, 00:13:49.357 "zone_management": false, 00:13:49.357 "zone_append": false, 00:13:49.357 "compare": false, 00:13:49.357 "compare_and_write": false, 00:13:49.357 "abort": true, 00:13:49.357 "seek_hole": false, 00:13:49.357 "seek_data": false, 00:13:49.357 "copy": true, 00:13:49.357 "nvme_iov_md": false 00:13:49.357 }, 00:13:49.357 "memory_domains": [ 00:13:49.357 { 00:13:49.357 "dma_device_id": "system", 00:13:49.357 "dma_device_type": 1 00:13:49.357 }, 00:13:49.357 { 00:13:49.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.357 "dma_device_type": 2 00:13:49.357 } 00:13:49.357 ], 00:13:49.357 "driver_specific": {} 00:13:49.357 } 00:13:49.357 ] 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:49.357 
13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:49.357 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:49.617 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:49.617 "name": "Existed_Raid", 00:13:49.617 "uuid": "734fe83c-d162-4c15-b877-ec724bd5fc1b", 00:13:49.617 "strip_size_kb": 64, 00:13:49.617 "state": "configuring", 00:13:49.617 "raid_level": "raid0", 00:13:49.617 "superblock": true, 00:13:49.617 "num_base_bdevs": 3, 00:13:49.617 "num_base_bdevs_discovered": 2, 00:13:49.617 "num_base_bdevs_operational": 3, 00:13:49.617 "base_bdevs_list": [ 00:13:49.617 { 00:13:49.617 "name": "BaseBdev1", 00:13:49.617 "uuid": "b361cdbe-1504-47f9-ac0d-3395b4a1ddb0", 00:13:49.617 "is_configured": true, 00:13:49.617 "data_offset": 2048, 00:13:49.617 "data_size": 63488 00:13:49.617 }, 00:13:49.617 { 00:13:49.617 "name": "BaseBdev2", 00:13:49.617 "uuid": "3b46ec55-6a58-4422-a0ad-878c5f43291d", 00:13:49.617 "is_configured": true, 00:13:49.617 "data_offset": 2048, 00:13:49.617 "data_size": 63488 00:13:49.617 }, 00:13:49.617 { 00:13:49.617 "name": "BaseBdev3", 00:13:49.617 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:49.617 "is_configured": false, 00:13:49.617 "data_offset": 0, 00:13:49.617 "data_size": 0 00:13:49.617 } 00:13:49.617 ] 00:13:49.617 }' 00:13:49.617 
13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:49.617 13:22:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:50.186 13:22:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:50.445 [2024-07-25 13:22:30.996665] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:50.445 [2024-07-25 13:22:30.996784] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2239ea0 00:13:50.445 [2024-07-25 13:22:30.996792] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:50.445 [2024-07-25 13:22:30.996930] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2239b70 00:13:50.445 [2024-07-25 13:22:30.997019] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2239ea0 00:13:50.445 [2024-07-25 13:22:30.997024] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2239ea0 00:13:50.445 [2024-07-25 13:22:30.997095] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:50.445 BaseBdev3 00:13:50.445 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:50.445 13:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:13:50.445 13:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:50.445 13:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:50.445 13:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:50.445 13:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:13:50.445 13:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:50.445 13:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:50.704 [ 00:13:50.704 { 00:13:50.705 "name": "BaseBdev3", 00:13:50.705 "aliases": [ 00:13:50.705 "a935405e-029a-4461-b2ab-18fbc3a30a0e" 00:13:50.705 ], 00:13:50.705 "product_name": "Malloc disk", 00:13:50.705 "block_size": 512, 00:13:50.705 "num_blocks": 65536, 00:13:50.705 "uuid": "a935405e-029a-4461-b2ab-18fbc3a30a0e", 00:13:50.705 "assigned_rate_limits": { 00:13:50.705 "rw_ios_per_sec": 0, 00:13:50.705 "rw_mbytes_per_sec": 0, 00:13:50.705 "r_mbytes_per_sec": 0, 00:13:50.705 "w_mbytes_per_sec": 0 00:13:50.705 }, 00:13:50.705 "claimed": true, 00:13:50.705 "claim_type": "exclusive_write", 00:13:50.705 "zoned": false, 00:13:50.705 "supported_io_types": { 00:13:50.705 "read": true, 00:13:50.705 "write": true, 00:13:50.705 "unmap": true, 00:13:50.705 "flush": true, 00:13:50.705 "reset": true, 00:13:50.705 "nvme_admin": false, 00:13:50.705 "nvme_io": false, 00:13:50.705 "nvme_io_md": false, 00:13:50.705 "write_zeroes": true, 00:13:50.705 "zcopy": true, 00:13:50.705 "get_zone_info": false, 00:13:50.705 "zone_management": false, 00:13:50.705 "zone_append": false, 00:13:50.705 "compare": false, 00:13:50.705 "compare_and_write": false, 00:13:50.705 "abort": true, 00:13:50.705 "seek_hole": false, 00:13:50.705 "seek_data": false, 00:13:50.705 "copy": true, 00:13:50.705 "nvme_iov_md": false 00:13:50.705 }, 00:13:50.705 "memory_domains": [ 00:13:50.705 { 00:13:50.705 "dma_device_id": "system", 00:13:50.705 "dma_device_type": 1 00:13:50.705 }, 00:13:50.705 { 00:13:50.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:50.705 
"dma_device_type": 2 00:13:50.705 } 00:13:50.705 ], 00:13:50.705 "driver_specific": {} 00:13:50.705 } 00:13:50.705 ] 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:50.705 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:50.965 13:22:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:50.965 "name": "Existed_Raid", 00:13:50.965 "uuid": "734fe83c-d162-4c15-b877-ec724bd5fc1b", 00:13:50.965 "strip_size_kb": 64, 00:13:50.965 "state": "online", 00:13:50.965 "raid_level": "raid0", 00:13:50.965 "superblock": true, 00:13:50.965 "num_base_bdevs": 3, 00:13:50.965 "num_base_bdevs_discovered": 3, 00:13:50.965 "num_base_bdevs_operational": 3, 00:13:50.965 "base_bdevs_list": [ 00:13:50.965 { 00:13:50.965 "name": "BaseBdev1", 00:13:50.965 "uuid": "b361cdbe-1504-47f9-ac0d-3395b4a1ddb0", 00:13:50.965 "is_configured": true, 00:13:50.965 "data_offset": 2048, 00:13:50.965 "data_size": 63488 00:13:50.965 }, 00:13:50.965 { 00:13:50.965 "name": "BaseBdev2", 00:13:50.965 "uuid": "3b46ec55-6a58-4422-a0ad-878c5f43291d", 00:13:50.965 "is_configured": true, 00:13:50.965 "data_offset": 2048, 00:13:50.965 "data_size": 63488 00:13:50.965 }, 00:13:50.965 { 00:13:50.965 "name": "BaseBdev3", 00:13:50.966 "uuid": "a935405e-029a-4461-b2ab-18fbc3a30a0e", 00:13:50.966 "is_configured": true, 00:13:50.966 "data_offset": 2048, 00:13:50.966 "data_size": 63488 00:13:50.966 } 00:13:50.966 ] 00:13:50.966 }' 00:13:50.966 13:22:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:50.966 13:22:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:51.535 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:51.535 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:51.536 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:51.536 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:51.536 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:13:51.536 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:51.536 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:51.536 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:51.536 [2024-07-25 13:22:32.276132] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:51.536 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:51.536 "name": "Existed_Raid", 00:13:51.536 "aliases": [ 00:13:51.536 "734fe83c-d162-4c15-b877-ec724bd5fc1b" 00:13:51.536 ], 00:13:51.536 "product_name": "Raid Volume", 00:13:51.536 "block_size": 512, 00:13:51.536 "num_blocks": 190464, 00:13:51.536 "uuid": "734fe83c-d162-4c15-b877-ec724bd5fc1b", 00:13:51.536 "assigned_rate_limits": { 00:13:51.536 "rw_ios_per_sec": 0, 00:13:51.536 "rw_mbytes_per_sec": 0, 00:13:51.536 "r_mbytes_per_sec": 0, 00:13:51.536 "w_mbytes_per_sec": 0 00:13:51.536 }, 00:13:51.536 "claimed": false, 00:13:51.536 "zoned": false, 00:13:51.536 "supported_io_types": { 00:13:51.536 "read": true, 00:13:51.536 "write": true, 00:13:51.536 "unmap": true, 00:13:51.536 "flush": true, 00:13:51.536 "reset": true, 00:13:51.536 "nvme_admin": false, 00:13:51.536 "nvme_io": false, 00:13:51.536 "nvme_io_md": false, 00:13:51.536 "write_zeroes": true, 00:13:51.536 "zcopy": false, 00:13:51.536 "get_zone_info": false, 00:13:51.536 "zone_management": false, 00:13:51.536 "zone_append": false, 00:13:51.536 "compare": false, 00:13:51.536 "compare_and_write": false, 00:13:51.536 "abort": false, 00:13:51.536 "seek_hole": false, 00:13:51.536 "seek_data": false, 00:13:51.536 "copy": false, 00:13:51.536 "nvme_iov_md": false 00:13:51.536 }, 00:13:51.536 "memory_domains": [ 00:13:51.536 { 00:13:51.536 "dma_device_id": "system", 00:13:51.536 
"dma_device_type": 1 00:13:51.536 }, 00:13:51.536 { 00:13:51.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.536 "dma_device_type": 2 00:13:51.536 }, 00:13:51.536 { 00:13:51.536 "dma_device_id": "system", 00:13:51.536 "dma_device_type": 1 00:13:51.536 }, 00:13:51.536 { 00:13:51.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.536 "dma_device_type": 2 00:13:51.536 }, 00:13:51.536 { 00:13:51.536 "dma_device_id": "system", 00:13:51.536 "dma_device_type": 1 00:13:51.536 }, 00:13:51.536 { 00:13:51.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.536 "dma_device_type": 2 00:13:51.536 } 00:13:51.536 ], 00:13:51.536 "driver_specific": { 00:13:51.536 "raid": { 00:13:51.536 "uuid": "734fe83c-d162-4c15-b877-ec724bd5fc1b", 00:13:51.536 "strip_size_kb": 64, 00:13:51.536 "state": "online", 00:13:51.536 "raid_level": "raid0", 00:13:51.536 "superblock": true, 00:13:51.536 "num_base_bdevs": 3, 00:13:51.536 "num_base_bdevs_discovered": 3, 00:13:51.536 "num_base_bdevs_operational": 3, 00:13:51.536 "base_bdevs_list": [ 00:13:51.536 { 00:13:51.536 "name": "BaseBdev1", 00:13:51.536 "uuid": "b361cdbe-1504-47f9-ac0d-3395b4a1ddb0", 00:13:51.536 "is_configured": true, 00:13:51.536 "data_offset": 2048, 00:13:51.536 "data_size": 63488 00:13:51.536 }, 00:13:51.536 { 00:13:51.536 "name": "BaseBdev2", 00:13:51.536 "uuid": "3b46ec55-6a58-4422-a0ad-878c5f43291d", 00:13:51.536 "is_configured": true, 00:13:51.536 "data_offset": 2048, 00:13:51.536 "data_size": 63488 00:13:51.536 }, 00:13:51.536 { 00:13:51.536 "name": "BaseBdev3", 00:13:51.536 "uuid": "a935405e-029a-4461-b2ab-18fbc3a30a0e", 00:13:51.536 "is_configured": true, 00:13:51.536 "data_offset": 2048, 00:13:51.536 "data_size": 63488 00:13:51.536 } 00:13:51.536 ] 00:13:51.536 } 00:13:51.536 } 00:13:51.536 }' 00:13:51.536 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:51.796 13:22:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:51.796 BaseBdev2 00:13:51.796 BaseBdev3' 00:13:51.796 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:51.796 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:51.796 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:51.796 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:51.796 "name": "BaseBdev1", 00:13:51.796 "aliases": [ 00:13:51.796 "b361cdbe-1504-47f9-ac0d-3395b4a1ddb0" 00:13:51.796 ], 00:13:51.796 "product_name": "Malloc disk", 00:13:51.796 "block_size": 512, 00:13:51.796 "num_blocks": 65536, 00:13:51.796 "uuid": "b361cdbe-1504-47f9-ac0d-3395b4a1ddb0", 00:13:51.796 "assigned_rate_limits": { 00:13:51.796 "rw_ios_per_sec": 0, 00:13:51.796 "rw_mbytes_per_sec": 0, 00:13:51.796 "r_mbytes_per_sec": 0, 00:13:51.796 "w_mbytes_per_sec": 0 00:13:51.796 }, 00:13:51.796 "claimed": true, 00:13:51.796 "claim_type": "exclusive_write", 00:13:51.796 "zoned": false, 00:13:51.796 "supported_io_types": { 00:13:51.796 "read": true, 00:13:51.796 "write": true, 00:13:51.796 "unmap": true, 00:13:51.796 "flush": true, 00:13:51.796 "reset": true, 00:13:51.796 "nvme_admin": false, 00:13:51.796 "nvme_io": false, 00:13:51.796 "nvme_io_md": false, 00:13:51.796 "write_zeroes": true, 00:13:51.796 "zcopy": true, 00:13:51.796 "get_zone_info": false, 00:13:51.796 "zone_management": false, 00:13:51.796 "zone_append": false, 00:13:51.796 "compare": false, 00:13:51.796 "compare_and_write": false, 00:13:51.796 "abort": true, 00:13:51.796 "seek_hole": false, 00:13:51.796 "seek_data": false, 00:13:51.796 "copy": true, 00:13:51.796 "nvme_iov_md": false 00:13:51.796 }, 00:13:51.796 "memory_domains": 
[ 00:13:51.796 { 00:13:51.796 "dma_device_id": "system", 00:13:51.796 "dma_device_type": 1 00:13:51.796 }, 00:13:51.796 { 00:13:51.796 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:51.796 "dma_device_type": 2 00:13:51.796 } 00:13:51.796 ], 00:13:51.796 "driver_specific": {} 00:13:51.796 }' 00:13:51.796 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:51.796 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.055 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:52.055 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.055 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.055 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:52.055 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.055 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.055 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:52.055 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.055 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.315 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:52.315 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:52.315 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:52.315 13:22:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:13:52.315 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:52.315 "name": "BaseBdev2", 00:13:52.315 "aliases": [ 00:13:52.315 "3b46ec55-6a58-4422-a0ad-878c5f43291d" 00:13:52.315 ], 00:13:52.315 "product_name": "Malloc disk", 00:13:52.315 "block_size": 512, 00:13:52.315 "num_blocks": 65536, 00:13:52.315 "uuid": "3b46ec55-6a58-4422-a0ad-878c5f43291d", 00:13:52.315 "assigned_rate_limits": { 00:13:52.315 "rw_ios_per_sec": 0, 00:13:52.315 "rw_mbytes_per_sec": 0, 00:13:52.315 "r_mbytes_per_sec": 0, 00:13:52.315 "w_mbytes_per_sec": 0 00:13:52.315 }, 00:13:52.315 "claimed": true, 00:13:52.315 "claim_type": "exclusive_write", 00:13:52.315 "zoned": false, 00:13:52.315 "supported_io_types": { 00:13:52.315 "read": true, 00:13:52.315 "write": true, 00:13:52.315 "unmap": true, 00:13:52.315 "flush": true, 00:13:52.315 "reset": true, 00:13:52.315 "nvme_admin": false, 00:13:52.315 "nvme_io": false, 00:13:52.315 "nvme_io_md": false, 00:13:52.315 "write_zeroes": true, 00:13:52.315 "zcopy": true, 00:13:52.315 "get_zone_info": false, 00:13:52.315 "zone_management": false, 00:13:52.315 "zone_append": false, 00:13:52.315 "compare": false, 00:13:52.315 "compare_and_write": false, 00:13:52.315 "abort": true, 00:13:52.315 "seek_hole": false, 00:13:52.315 "seek_data": false, 00:13:52.315 "copy": true, 00:13:52.315 "nvme_iov_md": false 00:13:52.315 }, 00:13:52.315 "memory_domains": [ 00:13:52.315 { 00:13:52.315 "dma_device_id": "system", 00:13:52.315 "dma_device_type": 1 00:13:52.315 }, 00:13:52.315 { 00:13:52.315 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.315 "dma_device_type": 2 00:13:52.315 } 00:13:52.315 ], 00:13:52.315 "driver_specific": {} 00:13:52.315 }' 00:13:52.315 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.575 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:52.575 13:22:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:52.575 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.575 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:52.575 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:52.575 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.575 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:52.575 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:52.575 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.834 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:52.834 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:52.834 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:52.834 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:52.834 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:52.834 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:52.834 "name": "BaseBdev3", 00:13:52.834 "aliases": [ 00:13:52.834 "a935405e-029a-4461-b2ab-18fbc3a30a0e" 00:13:52.834 ], 00:13:52.834 "product_name": "Malloc disk", 00:13:52.834 "block_size": 512, 00:13:52.834 "num_blocks": 65536, 00:13:52.834 "uuid": "a935405e-029a-4461-b2ab-18fbc3a30a0e", 00:13:52.834 "assigned_rate_limits": { 00:13:52.834 "rw_ios_per_sec": 0, 00:13:52.834 "rw_mbytes_per_sec": 0, 00:13:52.834 "r_mbytes_per_sec": 0, 00:13:52.834 
"w_mbytes_per_sec": 0 00:13:52.834 }, 00:13:52.834 "claimed": true, 00:13:52.834 "claim_type": "exclusive_write", 00:13:52.834 "zoned": false, 00:13:52.834 "supported_io_types": { 00:13:52.834 "read": true, 00:13:52.834 "write": true, 00:13:52.834 "unmap": true, 00:13:52.834 "flush": true, 00:13:52.834 "reset": true, 00:13:52.834 "nvme_admin": false, 00:13:52.834 "nvme_io": false, 00:13:52.834 "nvme_io_md": false, 00:13:52.834 "write_zeroes": true, 00:13:52.834 "zcopy": true, 00:13:52.834 "get_zone_info": false, 00:13:52.834 "zone_management": false, 00:13:52.834 "zone_append": false, 00:13:52.834 "compare": false, 00:13:52.834 "compare_and_write": false, 00:13:52.834 "abort": true, 00:13:52.834 "seek_hole": false, 00:13:52.834 "seek_data": false, 00:13:52.834 "copy": true, 00:13:52.834 "nvme_iov_md": false 00:13:52.834 }, 00:13:52.834 "memory_domains": [ 00:13:52.834 { 00:13:52.834 "dma_device_id": "system", 00:13:52.834 "dma_device_type": 1 00:13:52.834 }, 00:13:52.834 { 00:13:52.834 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:52.834 "dma_device_type": 2 00:13:52.834 } 00:13:52.834 ], 00:13:52.834 "driver_specific": {} 00:13:52.834 }' 00:13:52.834 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.093 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:53.093 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:53.093 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.093 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:53.093 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:53.093 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:53.093 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:13:53.093 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:53.093 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.353 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:53.353 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:53.353 13:22:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:53.612 [2024-07-25 13:22:34.148682] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:53.612 [2024-07-25 13:22:34.148700] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:53.612 [2024-07-25 13:22:34.148732] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:53.612 "name": "Existed_Raid", 00:13:53.612 "uuid": "734fe83c-d162-4c15-b877-ec724bd5fc1b", 00:13:53.612 "strip_size_kb": 64, 00:13:53.612 "state": "offline", 00:13:53.612 "raid_level": "raid0", 00:13:53.612 "superblock": true, 00:13:53.612 "num_base_bdevs": 3, 00:13:53.612 "num_base_bdevs_discovered": 2, 00:13:53.612 "num_base_bdevs_operational": 2, 00:13:53.612 "base_bdevs_list": [ 00:13:53.612 { 00:13:53.612 "name": null, 00:13:53.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:53.612 "is_configured": false, 00:13:53.612 "data_offset": 2048, 00:13:53.612 "data_size": 63488 00:13:53.612 }, 00:13:53.612 { 00:13:53.612 "name": "BaseBdev2", 00:13:53.612 "uuid": "3b46ec55-6a58-4422-a0ad-878c5f43291d", 00:13:53.612 "is_configured": true, 00:13:53.612 "data_offset": 2048, 00:13:53.612 "data_size": 
63488 00:13:53.612 }, 00:13:53.612 { 00:13:53.612 "name": "BaseBdev3", 00:13:53.612 "uuid": "a935405e-029a-4461-b2ab-18fbc3a30a0e", 00:13:53.612 "is_configured": true, 00:13:53.612 "data_offset": 2048, 00:13:53.612 "data_size": 63488 00:13:53.612 } 00:13:53.612 ] 00:13:53.612 }' 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:53.612 13:22:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:54.182 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:54.182 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:54.182 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.182 13:22:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:54.441 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:54.441 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:54.441 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:54.701 [2024-07-25 13:22:35.283552] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:54.701 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:54.701 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:54.701 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:13:54.701 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:54.961 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:54.961 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:54.961 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:13:54.961 [2024-07-25 13:22:35.670342] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:54.961 [2024-07-25 13:22:35.670370] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2239ea0 name Existed_Raid, state offline 00:13:54.961 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:54.961 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:54.961 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.961 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:55.221 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:55.221 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:55.221 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:55.221 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:55.221 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:55.221 13:22:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:55.481 BaseBdev2 00:13:55.481 13:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:55.481 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:13:55.481 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:55.481 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:55.481 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:55.481 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:55.481 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:55.481 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:55.741 [ 00:13:55.741 { 00:13:55.741 "name": "BaseBdev2", 00:13:55.741 "aliases": [ 00:13:55.741 "cc2f37e5-d4bc-4007-8b75-b062fca52ef9" 00:13:55.741 ], 00:13:55.741 "product_name": "Malloc disk", 00:13:55.741 "block_size": 512, 00:13:55.741 "num_blocks": 65536, 00:13:55.741 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:13:55.741 "assigned_rate_limits": { 00:13:55.741 "rw_ios_per_sec": 0, 00:13:55.741 "rw_mbytes_per_sec": 0, 00:13:55.741 "r_mbytes_per_sec": 0, 00:13:55.741 "w_mbytes_per_sec": 0 00:13:55.741 }, 00:13:55.741 "claimed": false, 00:13:55.741 "zoned": false, 00:13:55.741 "supported_io_types": { 00:13:55.741 "read": true, 00:13:55.741 "write": true, 00:13:55.741 "unmap": true, 00:13:55.741 "flush": 
true, 00:13:55.741 "reset": true, 00:13:55.741 "nvme_admin": false, 00:13:55.741 "nvme_io": false, 00:13:55.741 "nvme_io_md": false, 00:13:55.741 "write_zeroes": true, 00:13:55.741 "zcopy": true, 00:13:55.741 "get_zone_info": false, 00:13:55.741 "zone_management": false, 00:13:55.741 "zone_append": false, 00:13:55.741 "compare": false, 00:13:55.741 "compare_and_write": false, 00:13:55.741 "abort": true, 00:13:55.741 "seek_hole": false, 00:13:55.741 "seek_data": false, 00:13:55.741 "copy": true, 00:13:55.741 "nvme_iov_md": false 00:13:55.741 }, 00:13:55.741 "memory_domains": [ 00:13:55.741 { 00:13:55.741 "dma_device_id": "system", 00:13:55.741 "dma_device_type": 1 00:13:55.741 }, 00:13:55.741 { 00:13:55.741 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:55.741 "dma_device_type": 2 00:13:55.741 } 00:13:55.741 ], 00:13:55.741 "driver_specific": {} 00:13:55.741 } 00:13:55.741 ] 00:13:55.741 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:55.741 13:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:55.741 13:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:55.741 13:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:56.001 BaseBdev3 00:13:56.001 13:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:56.001 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:13:56.001 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:56.001 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:56.001 13:22:36 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:56.001 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:56.001 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:56.260 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:56.260 [ 00:13:56.260 { 00:13:56.260 "name": "BaseBdev3", 00:13:56.260 "aliases": [ 00:13:56.260 "ffa6410e-d05f-4283-a328-8941a44e507c" 00:13:56.260 ], 00:13:56.260 "product_name": "Malloc disk", 00:13:56.260 "block_size": 512, 00:13:56.260 "num_blocks": 65536, 00:13:56.260 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:13:56.260 "assigned_rate_limits": { 00:13:56.260 "rw_ios_per_sec": 0, 00:13:56.260 "rw_mbytes_per_sec": 0, 00:13:56.260 "r_mbytes_per_sec": 0, 00:13:56.260 "w_mbytes_per_sec": 0 00:13:56.260 }, 00:13:56.260 "claimed": false, 00:13:56.260 "zoned": false, 00:13:56.260 "supported_io_types": { 00:13:56.260 "read": true, 00:13:56.260 "write": true, 00:13:56.260 "unmap": true, 00:13:56.260 "flush": true, 00:13:56.260 "reset": true, 00:13:56.260 "nvme_admin": false, 00:13:56.260 "nvme_io": false, 00:13:56.260 "nvme_io_md": false, 00:13:56.260 "write_zeroes": true, 00:13:56.260 "zcopy": true, 00:13:56.260 "get_zone_info": false, 00:13:56.260 "zone_management": false, 00:13:56.260 "zone_append": false, 00:13:56.260 "compare": false, 00:13:56.260 "compare_and_write": false, 00:13:56.260 "abort": true, 00:13:56.260 "seek_hole": false, 00:13:56.260 "seek_data": false, 00:13:56.260 "copy": true, 00:13:56.260 "nvme_iov_md": false 00:13:56.260 }, 00:13:56.260 "memory_domains": [ 00:13:56.260 { 00:13:56.260 "dma_device_id": "system", 00:13:56.260 "dma_device_type": 1 
00:13:56.260 }, 00:13:56.260 { 00:13:56.260 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:56.260 "dma_device_type": 2 00:13:56.260 } 00:13:56.260 ], 00:13:56.260 "driver_specific": {} 00:13:56.260 } 00:13:56.260 ] 00:13:56.260 13:22:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:56.260 13:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:56.260 13:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:56.260 13:22:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:56.520 [2024-07-25 13:22:37.154053] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:56.520 [2024-07-25 13:22:37.154080] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:56.520 [2024-07-25 13:22:37.154093] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:56.520 [2024-07-25 13:22:37.155101] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:56.520 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:56.780 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:56.780 "name": "Existed_Raid", 00:13:56.780 "uuid": "0d4ca5fa-2079-454b-91e9-cac435fcfacb", 00:13:56.780 "strip_size_kb": 64, 00:13:56.780 "state": "configuring", 00:13:56.780 "raid_level": "raid0", 00:13:56.780 "superblock": true, 00:13:56.780 "num_base_bdevs": 3, 00:13:56.780 "num_base_bdevs_discovered": 2, 00:13:56.780 "num_base_bdevs_operational": 3, 00:13:56.780 "base_bdevs_list": [ 00:13:56.780 { 00:13:56.780 "name": "BaseBdev1", 00:13:56.780 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:56.780 "is_configured": false, 00:13:56.780 "data_offset": 0, 00:13:56.780 "data_size": 0 00:13:56.780 }, 00:13:56.780 { 00:13:56.780 "name": "BaseBdev2", 00:13:56.780 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:13:56.780 "is_configured": true, 00:13:56.780 "data_offset": 2048, 00:13:56.780 "data_size": 63488 00:13:56.780 }, 00:13:56.780 { 00:13:56.780 "name": "BaseBdev3", 00:13:56.780 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:13:56.780 "is_configured": true, 00:13:56.780 "data_offset": 2048, 00:13:56.780 
"data_size": 63488 00:13:56.780 } 00:13:56.780 ] 00:13:56.780 }' 00:13:56.780 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:56.780 13:22:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.349 13:22:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:57.349 [2024-07-25 13:22:38.052306] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:57.349 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:57.349 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.349 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.349 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:57.350 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:57.350 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.350 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.350 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.350 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.350 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.350 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:13:57.350 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.609 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.609 "name": "Existed_Raid", 00:13:57.609 "uuid": "0d4ca5fa-2079-454b-91e9-cac435fcfacb", 00:13:57.609 "strip_size_kb": 64, 00:13:57.609 "state": "configuring", 00:13:57.609 "raid_level": "raid0", 00:13:57.609 "superblock": true, 00:13:57.609 "num_base_bdevs": 3, 00:13:57.609 "num_base_bdevs_discovered": 1, 00:13:57.609 "num_base_bdevs_operational": 3, 00:13:57.609 "base_bdevs_list": [ 00:13:57.609 { 00:13:57.609 "name": "BaseBdev1", 00:13:57.609 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.609 "is_configured": false, 00:13:57.609 "data_offset": 0, 00:13:57.609 "data_size": 0 00:13:57.609 }, 00:13:57.609 { 00:13:57.609 "name": null, 00:13:57.609 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:13:57.609 "is_configured": false, 00:13:57.609 "data_offset": 2048, 00:13:57.609 "data_size": 63488 00:13:57.609 }, 00:13:57.609 { 00:13:57.609 "name": "BaseBdev3", 00:13:57.609 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:13:57.609 "is_configured": true, 00:13:57.609 "data_offset": 2048, 00:13:57.609 "data_size": 63488 00:13:57.609 } 00:13:57.609 ] 00:13:57.609 }' 00:13:57.609 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.609 13:22:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:58.180 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:58.180 13:22:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.439 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:13:58.439 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:58.439 [2024-07-25 13:22:39.180076] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:58.439 BaseBdev1 00:13:58.439 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:58.439 13:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:13:58.440 13:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:13:58.440 13:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:13:58.440 13:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:13:58.440 13:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:13:58.440 13:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:58.699 13:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:58.958 [ 00:13:58.958 { 00:13:58.958 "name": "BaseBdev1", 00:13:58.958 "aliases": [ 00:13:58.958 "ed15a956-1568-43f8-8f20-23fc753ad402" 00:13:58.958 ], 00:13:58.958 "product_name": "Malloc disk", 00:13:58.958 "block_size": 512, 00:13:58.958 "num_blocks": 65536, 00:13:58.958 "uuid": "ed15a956-1568-43f8-8f20-23fc753ad402", 00:13:58.958 "assigned_rate_limits": { 00:13:58.958 "rw_ios_per_sec": 0, 00:13:58.958 "rw_mbytes_per_sec": 0, 00:13:58.958 "r_mbytes_per_sec": 0, 00:13:58.958 
"w_mbytes_per_sec": 0 00:13:58.958 }, 00:13:58.958 "claimed": true, 00:13:58.958 "claim_type": "exclusive_write", 00:13:58.958 "zoned": false, 00:13:58.958 "supported_io_types": { 00:13:58.958 "read": true, 00:13:58.958 "write": true, 00:13:58.958 "unmap": true, 00:13:58.958 "flush": true, 00:13:58.959 "reset": true, 00:13:58.959 "nvme_admin": false, 00:13:58.959 "nvme_io": false, 00:13:58.959 "nvme_io_md": false, 00:13:58.959 "write_zeroes": true, 00:13:58.959 "zcopy": true, 00:13:58.959 "get_zone_info": false, 00:13:58.959 "zone_management": false, 00:13:58.959 "zone_append": false, 00:13:58.959 "compare": false, 00:13:58.959 "compare_and_write": false, 00:13:58.959 "abort": true, 00:13:58.959 "seek_hole": false, 00:13:58.959 "seek_data": false, 00:13:58.959 "copy": true, 00:13:58.959 "nvme_iov_md": false 00:13:58.959 }, 00:13:58.959 "memory_domains": [ 00:13:58.959 { 00:13:58.959 "dma_device_id": "system", 00:13:58.959 "dma_device_type": 1 00:13:58.959 }, 00:13:58.959 { 00:13:58.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.959 "dma_device_type": 2 00:13:58.959 } 00:13:58.959 ], 00:13:58.959 "driver_specific": {} 00:13:58.959 } 00:13:58.959 ] 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.959 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.218 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.218 "name": "Existed_Raid", 00:13:59.218 "uuid": "0d4ca5fa-2079-454b-91e9-cac435fcfacb", 00:13:59.218 "strip_size_kb": 64, 00:13:59.218 "state": "configuring", 00:13:59.218 "raid_level": "raid0", 00:13:59.218 "superblock": true, 00:13:59.218 "num_base_bdevs": 3, 00:13:59.218 "num_base_bdevs_discovered": 2, 00:13:59.218 "num_base_bdevs_operational": 3, 00:13:59.218 "base_bdevs_list": [ 00:13:59.218 { 00:13:59.218 "name": "BaseBdev1", 00:13:59.218 "uuid": "ed15a956-1568-43f8-8f20-23fc753ad402", 00:13:59.218 "is_configured": true, 00:13:59.218 "data_offset": 2048, 00:13:59.218 "data_size": 63488 00:13:59.218 }, 00:13:59.218 { 00:13:59.218 "name": null, 00:13:59.218 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:13:59.218 "is_configured": false, 00:13:59.218 "data_offset": 2048, 00:13:59.218 "data_size": 63488 00:13:59.218 }, 00:13:59.218 { 00:13:59.218 "name": "BaseBdev3", 00:13:59.218 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:13:59.218 "is_configured": true, 00:13:59.218 "data_offset": 2048, 00:13:59.218 "data_size": 63488 00:13:59.218 } 
00:13:59.218 ] 00:13:59.218 }' 00:13:59.218 13:22:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.218 13:22:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.787 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:59.787 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:59.787 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:59.787 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:00.046 [2024-07-25 13:22:40.688301] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:00.046 
13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.046 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:00.305 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:00.305 "name": "Existed_Raid", 00:14:00.305 "uuid": "0d4ca5fa-2079-454b-91e9-cac435fcfacb", 00:14:00.305 "strip_size_kb": 64, 00:14:00.305 "state": "configuring", 00:14:00.305 "raid_level": "raid0", 00:14:00.305 "superblock": true, 00:14:00.305 "num_base_bdevs": 3, 00:14:00.305 "num_base_bdevs_discovered": 1, 00:14:00.305 "num_base_bdevs_operational": 3, 00:14:00.305 "base_bdevs_list": [ 00:14:00.305 { 00:14:00.305 "name": "BaseBdev1", 00:14:00.305 "uuid": "ed15a956-1568-43f8-8f20-23fc753ad402", 00:14:00.305 "is_configured": true, 00:14:00.305 "data_offset": 2048, 00:14:00.305 "data_size": 63488 00:14:00.305 }, 00:14:00.305 { 00:14:00.305 "name": null, 00:14:00.305 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:14:00.305 "is_configured": false, 00:14:00.305 "data_offset": 2048, 00:14:00.305 "data_size": 63488 00:14:00.305 }, 00:14:00.305 { 00:14:00.305 "name": null, 00:14:00.305 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:14:00.305 "is_configured": false, 00:14:00.305 "data_offset": 2048, 00:14:00.305 "data_size": 63488 00:14:00.305 } 00:14:00.305 ] 00:14:00.305 }' 00:14:00.305 13:22:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:00.305 13:22:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:00.873 13:22:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:00.873 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:00.873 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:00.873 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:01.132 [2024-07-25 13:22:41.815163] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:01.132 13:22:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.132 13:22:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:01.391 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:01.391 "name": "Existed_Raid", 00:14:01.391 "uuid": "0d4ca5fa-2079-454b-91e9-cac435fcfacb", 00:14:01.391 "strip_size_kb": 64, 00:14:01.391 "state": "configuring", 00:14:01.391 "raid_level": "raid0", 00:14:01.391 "superblock": true, 00:14:01.391 "num_base_bdevs": 3, 00:14:01.391 "num_base_bdevs_discovered": 2, 00:14:01.391 "num_base_bdevs_operational": 3, 00:14:01.391 "base_bdevs_list": [ 00:14:01.391 { 00:14:01.391 "name": "BaseBdev1", 00:14:01.391 "uuid": "ed15a956-1568-43f8-8f20-23fc753ad402", 00:14:01.391 "is_configured": true, 00:14:01.391 "data_offset": 2048, 00:14:01.391 "data_size": 63488 00:14:01.391 }, 00:14:01.391 { 00:14:01.391 "name": null, 00:14:01.391 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:14:01.391 "is_configured": false, 00:14:01.391 "data_offset": 2048, 00:14:01.391 "data_size": 63488 00:14:01.391 }, 00:14:01.391 { 00:14:01.391 "name": "BaseBdev3", 00:14:01.391 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:14:01.391 "is_configured": true, 00:14:01.391 "data_offset": 2048, 00:14:01.391 "data_size": 63488 00:14:01.391 } 00:14:01.391 ] 00:14:01.391 }' 00:14:01.391 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:01.391 13:22:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:01.960 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:01.960 13:22:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:02.219 [2024-07-25 13:22:42.954043] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.219 13:22:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.479 13:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.479 "name": "Existed_Raid", 00:14:02.479 "uuid": "0d4ca5fa-2079-454b-91e9-cac435fcfacb", 00:14:02.479 "strip_size_kb": 64, 00:14:02.479 "state": "configuring", 00:14:02.479 "raid_level": "raid0", 00:14:02.479 "superblock": true, 00:14:02.479 "num_base_bdevs": 3, 00:14:02.479 "num_base_bdevs_discovered": 1, 00:14:02.479 "num_base_bdevs_operational": 3, 00:14:02.479 "base_bdevs_list": [ 00:14:02.479 { 00:14:02.479 "name": null, 00:14:02.479 "uuid": "ed15a956-1568-43f8-8f20-23fc753ad402", 00:14:02.479 "is_configured": false, 00:14:02.479 "data_offset": 2048, 00:14:02.479 "data_size": 63488 00:14:02.479 }, 00:14:02.479 { 00:14:02.479 "name": null, 00:14:02.479 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:14:02.479 "is_configured": false, 00:14:02.479 "data_offset": 2048, 00:14:02.479 "data_size": 63488 00:14:02.479 }, 00:14:02.479 { 00:14:02.479 "name": "BaseBdev3", 00:14:02.479 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:14:02.479 "is_configured": true, 00:14:02.479 "data_offset": 2048, 00:14:02.479 "data_size": 63488 00:14:02.479 } 00:14:02.479 ] 00:14:02.479 }' 00:14:02.479 13:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.479 13:22:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:03.048 13:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.048 13:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:03.307 13:22:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:03.307 13:22:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:03.307 [2024-07-25 13:22:44.030542] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.307 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:03.566 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:03.566 "name": 
"Existed_Raid", 00:14:03.566 "uuid": "0d4ca5fa-2079-454b-91e9-cac435fcfacb", 00:14:03.566 "strip_size_kb": 64, 00:14:03.566 "state": "configuring", 00:14:03.566 "raid_level": "raid0", 00:14:03.566 "superblock": true, 00:14:03.566 "num_base_bdevs": 3, 00:14:03.566 "num_base_bdevs_discovered": 2, 00:14:03.566 "num_base_bdevs_operational": 3, 00:14:03.566 "base_bdevs_list": [ 00:14:03.566 { 00:14:03.566 "name": null, 00:14:03.566 "uuid": "ed15a956-1568-43f8-8f20-23fc753ad402", 00:14:03.566 "is_configured": false, 00:14:03.566 "data_offset": 2048, 00:14:03.566 "data_size": 63488 00:14:03.566 }, 00:14:03.566 { 00:14:03.566 "name": "BaseBdev2", 00:14:03.566 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:14:03.566 "is_configured": true, 00:14:03.566 "data_offset": 2048, 00:14:03.566 "data_size": 63488 00:14:03.566 }, 00:14:03.566 { 00:14:03.566 "name": "BaseBdev3", 00:14:03.566 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:14:03.566 "is_configured": true, 00:14:03.566 "data_offset": 2048, 00:14:03.566 "data_size": 63488 00:14:03.566 } 00:14:03.566 ] 00:14:03.566 }' 00:14:03.566 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:03.566 13:22:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:04.134 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.134 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:04.394 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:04.394 13:22:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.394 13:22:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:04.394 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u ed15a956-1568-43f8-8f20-23fc753ad402 00:14:04.653 [2024-07-25 13:22:45.354924] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:04.653 [2024-07-25 13:22:45.355031] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x223ab40 00:14:04.653 [2024-07-25 13:22:45.355038] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:04.653 [2024-07-25 13:22:45.355169] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2239e70 00:14:04.653 [2024-07-25 13:22:45.355254] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x223ab40 00:14:04.653 [2024-07-25 13:22:45.355259] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x223ab40 00:14:04.653 [2024-07-25 13:22:45.355325] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:04.653 NewBaseBdev 00:14:04.654 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:04.654 13:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:14:04.654 13:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:04.654 13:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:14:04.654 13:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:04.654 13:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:04.654 13:22:45 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.912 13:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:05.172 [ 00:14:05.172 { 00:14:05.172 "name": "NewBaseBdev", 00:14:05.172 "aliases": [ 00:14:05.172 "ed15a956-1568-43f8-8f20-23fc753ad402" 00:14:05.172 ], 00:14:05.172 "product_name": "Malloc disk", 00:14:05.172 "block_size": 512, 00:14:05.172 "num_blocks": 65536, 00:14:05.172 "uuid": "ed15a956-1568-43f8-8f20-23fc753ad402", 00:14:05.172 "assigned_rate_limits": { 00:14:05.172 "rw_ios_per_sec": 0, 00:14:05.172 "rw_mbytes_per_sec": 0, 00:14:05.172 "r_mbytes_per_sec": 0, 00:14:05.172 "w_mbytes_per_sec": 0 00:14:05.172 }, 00:14:05.172 "claimed": true, 00:14:05.172 "claim_type": "exclusive_write", 00:14:05.172 "zoned": false, 00:14:05.172 "supported_io_types": { 00:14:05.172 "read": true, 00:14:05.172 "write": true, 00:14:05.172 "unmap": true, 00:14:05.172 "flush": true, 00:14:05.172 "reset": true, 00:14:05.172 "nvme_admin": false, 00:14:05.172 "nvme_io": false, 00:14:05.172 "nvme_io_md": false, 00:14:05.172 "write_zeroes": true, 00:14:05.172 "zcopy": true, 00:14:05.172 "get_zone_info": false, 00:14:05.172 "zone_management": false, 00:14:05.172 "zone_append": false, 00:14:05.172 "compare": false, 00:14:05.172 "compare_and_write": false, 00:14:05.172 "abort": true, 00:14:05.172 "seek_hole": false, 00:14:05.172 "seek_data": false, 00:14:05.172 "copy": true, 00:14:05.172 "nvme_iov_md": false 00:14:05.172 }, 00:14:05.172 "memory_domains": [ 00:14:05.172 { 00:14:05.172 "dma_device_id": "system", 00:14:05.172 "dma_device_type": 1 00:14:05.172 }, 00:14:05.172 { 00:14:05.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.172 "dma_device_type": 2 00:14:05.172 } 
00:14:05.172 ], 00:14:05.172 "driver_specific": {} 00:14:05.172 } 00:14:05.172 ] 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:05.172 "name": "Existed_Raid", 00:14:05.172 "uuid": "0d4ca5fa-2079-454b-91e9-cac435fcfacb", 00:14:05.172 "strip_size_kb": 64, 00:14:05.172 "state": "online", 00:14:05.172 
"raid_level": "raid0", 00:14:05.172 "superblock": true, 00:14:05.172 "num_base_bdevs": 3, 00:14:05.172 "num_base_bdevs_discovered": 3, 00:14:05.172 "num_base_bdevs_operational": 3, 00:14:05.172 "base_bdevs_list": [ 00:14:05.172 { 00:14:05.172 "name": "NewBaseBdev", 00:14:05.172 "uuid": "ed15a956-1568-43f8-8f20-23fc753ad402", 00:14:05.172 "is_configured": true, 00:14:05.172 "data_offset": 2048, 00:14:05.172 "data_size": 63488 00:14:05.172 }, 00:14:05.172 { 00:14:05.172 "name": "BaseBdev2", 00:14:05.172 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:14:05.172 "is_configured": true, 00:14:05.172 "data_offset": 2048, 00:14:05.172 "data_size": 63488 00:14:05.172 }, 00:14:05.172 { 00:14:05.172 "name": "BaseBdev3", 00:14:05.172 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:14:05.172 "is_configured": true, 00:14:05.172 "data_offset": 2048, 00:14:05.172 "data_size": 63488 00:14:05.172 } 00:14:05.172 ] 00:14:05.172 }' 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:05.172 13:22:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:05.739 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:05.739 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:05.739 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:05.739 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:05.739 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:05.739 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:05.739 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:05.739 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:05.998 [2024-07-25 13:22:46.662507] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:05.998 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:05.998 "name": "Existed_Raid", 00:14:05.998 "aliases": [ 00:14:05.998 "0d4ca5fa-2079-454b-91e9-cac435fcfacb" 00:14:05.998 ], 00:14:05.998 "product_name": "Raid Volume", 00:14:05.998 "block_size": 512, 00:14:05.998 "num_blocks": 190464, 00:14:05.998 "uuid": "0d4ca5fa-2079-454b-91e9-cac435fcfacb", 00:14:05.998 "assigned_rate_limits": { 00:14:05.998 "rw_ios_per_sec": 0, 00:14:05.998 "rw_mbytes_per_sec": 0, 00:14:05.998 "r_mbytes_per_sec": 0, 00:14:05.998 "w_mbytes_per_sec": 0 00:14:05.998 }, 00:14:05.998 "claimed": false, 00:14:05.998 "zoned": false, 00:14:05.998 "supported_io_types": { 00:14:05.998 "read": true, 00:14:05.998 "write": true, 00:14:05.998 "unmap": true, 00:14:05.998 "flush": true, 00:14:05.998 "reset": true, 00:14:05.998 "nvme_admin": false, 00:14:05.998 "nvme_io": false, 00:14:05.998 "nvme_io_md": false, 00:14:05.998 "write_zeroes": true, 00:14:05.998 "zcopy": false, 00:14:05.998 "get_zone_info": false, 00:14:05.998 "zone_management": false, 00:14:05.998 "zone_append": false, 00:14:05.998 "compare": false, 00:14:05.998 "compare_and_write": false, 00:14:05.998 "abort": false, 00:14:05.998 "seek_hole": false, 00:14:05.998 "seek_data": false, 00:14:05.998 "copy": false, 00:14:05.998 "nvme_iov_md": false 00:14:05.998 }, 00:14:05.998 "memory_domains": [ 00:14:05.998 { 00:14:05.998 "dma_device_id": "system", 00:14:05.998 "dma_device_type": 1 00:14:05.998 }, 00:14:05.998 { 00:14:05.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.999 "dma_device_type": 2 00:14:05.999 }, 00:14:05.999 { 00:14:05.999 "dma_device_id": "system", 00:14:05.999 "dma_device_type": 1 00:14:05.999 
}, 00:14:05.999 { 00:14:05.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.999 "dma_device_type": 2 00:14:05.999 }, 00:14:05.999 { 00:14:05.999 "dma_device_id": "system", 00:14:05.999 "dma_device_type": 1 00:14:05.999 }, 00:14:05.999 { 00:14:05.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.999 "dma_device_type": 2 00:14:05.999 } 00:14:05.999 ], 00:14:05.999 "driver_specific": { 00:14:05.999 "raid": { 00:14:05.999 "uuid": "0d4ca5fa-2079-454b-91e9-cac435fcfacb", 00:14:05.999 "strip_size_kb": 64, 00:14:05.999 "state": "online", 00:14:05.999 "raid_level": "raid0", 00:14:05.999 "superblock": true, 00:14:05.999 "num_base_bdevs": 3, 00:14:05.999 "num_base_bdevs_discovered": 3, 00:14:05.999 "num_base_bdevs_operational": 3, 00:14:05.999 "base_bdevs_list": [ 00:14:05.999 { 00:14:05.999 "name": "NewBaseBdev", 00:14:05.999 "uuid": "ed15a956-1568-43f8-8f20-23fc753ad402", 00:14:05.999 "is_configured": true, 00:14:05.999 "data_offset": 2048, 00:14:05.999 "data_size": 63488 00:14:05.999 }, 00:14:05.999 { 00:14:05.999 "name": "BaseBdev2", 00:14:05.999 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:14:05.999 "is_configured": true, 00:14:05.999 "data_offset": 2048, 00:14:05.999 "data_size": 63488 00:14:05.999 }, 00:14:05.999 { 00:14:05.999 "name": "BaseBdev3", 00:14:05.999 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:14:05.999 "is_configured": true, 00:14:05.999 "data_offset": 2048, 00:14:05.999 "data_size": 63488 00:14:05.999 } 00:14:05.999 ] 00:14:05.999 } 00:14:05.999 } 00:14:05.999 }' 00:14:05.999 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:05.999 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:05.999 BaseBdev2 00:14:05.999 BaseBdev3' 00:14:05.999 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:05.999 
13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:05.999 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:06.258 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:06.258 "name": "NewBaseBdev", 00:14:06.258 "aliases": [ 00:14:06.258 "ed15a956-1568-43f8-8f20-23fc753ad402" 00:14:06.258 ], 00:14:06.258 "product_name": "Malloc disk", 00:14:06.258 "block_size": 512, 00:14:06.258 "num_blocks": 65536, 00:14:06.258 "uuid": "ed15a956-1568-43f8-8f20-23fc753ad402", 00:14:06.258 "assigned_rate_limits": { 00:14:06.258 "rw_ios_per_sec": 0, 00:14:06.258 "rw_mbytes_per_sec": 0, 00:14:06.258 "r_mbytes_per_sec": 0, 00:14:06.258 "w_mbytes_per_sec": 0 00:14:06.258 }, 00:14:06.258 "claimed": true, 00:14:06.258 "claim_type": "exclusive_write", 00:14:06.258 "zoned": false, 00:14:06.258 "supported_io_types": { 00:14:06.258 "read": true, 00:14:06.258 "write": true, 00:14:06.258 "unmap": true, 00:14:06.258 "flush": true, 00:14:06.258 "reset": true, 00:14:06.258 "nvme_admin": false, 00:14:06.258 "nvme_io": false, 00:14:06.258 "nvme_io_md": false, 00:14:06.258 "write_zeroes": true, 00:14:06.258 "zcopy": true, 00:14:06.258 "get_zone_info": false, 00:14:06.258 "zone_management": false, 00:14:06.258 "zone_append": false, 00:14:06.258 "compare": false, 00:14:06.258 "compare_and_write": false, 00:14:06.258 "abort": true, 00:14:06.258 "seek_hole": false, 00:14:06.258 "seek_data": false, 00:14:06.258 "copy": true, 00:14:06.258 "nvme_iov_md": false 00:14:06.258 }, 00:14:06.258 "memory_domains": [ 00:14:06.258 { 00:14:06.258 "dma_device_id": "system", 00:14:06.258 "dma_device_type": 1 00:14:06.258 }, 00:14:06.258 { 00:14:06.258 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.258 "dma_device_type": 2 00:14:06.258 } 00:14:06.258 ], 00:14:06.258 
"driver_specific": {} 00:14:06.258 }' 00:14:06.258 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.259 13:22:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.259 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:06.259 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:06.517 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:06.777 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:06.777 "name": "BaseBdev2", 00:14:06.777 "aliases": [ 00:14:06.777 "cc2f37e5-d4bc-4007-8b75-b062fca52ef9" 00:14:06.777 ], 00:14:06.777 "product_name": 
"Malloc disk", 00:14:06.777 "block_size": 512, 00:14:06.777 "num_blocks": 65536, 00:14:06.777 "uuid": "cc2f37e5-d4bc-4007-8b75-b062fca52ef9", 00:14:06.777 "assigned_rate_limits": { 00:14:06.777 "rw_ios_per_sec": 0, 00:14:06.777 "rw_mbytes_per_sec": 0, 00:14:06.777 "r_mbytes_per_sec": 0, 00:14:06.777 "w_mbytes_per_sec": 0 00:14:06.777 }, 00:14:06.777 "claimed": true, 00:14:06.777 "claim_type": "exclusive_write", 00:14:06.777 "zoned": false, 00:14:06.777 "supported_io_types": { 00:14:06.777 "read": true, 00:14:06.777 "write": true, 00:14:06.777 "unmap": true, 00:14:06.777 "flush": true, 00:14:06.777 "reset": true, 00:14:06.777 "nvme_admin": false, 00:14:06.777 "nvme_io": false, 00:14:06.777 "nvme_io_md": false, 00:14:06.777 "write_zeroes": true, 00:14:06.777 "zcopy": true, 00:14:06.777 "get_zone_info": false, 00:14:06.777 "zone_management": false, 00:14:06.777 "zone_append": false, 00:14:06.777 "compare": false, 00:14:06.777 "compare_and_write": false, 00:14:06.777 "abort": true, 00:14:06.777 "seek_hole": false, 00:14:06.777 "seek_data": false, 00:14:06.777 "copy": true, 00:14:06.777 "nvme_iov_md": false 00:14:06.777 }, 00:14:06.777 "memory_domains": [ 00:14:06.777 { 00:14:06.777 "dma_device_id": "system", 00:14:06.777 "dma_device_type": 1 00:14:06.777 }, 00:14:06.777 { 00:14:06.777 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:06.777 "dma_device_type": 2 00:14:06.777 } 00:14:06.777 ], 00:14:06.777 "driver_specific": {} 00:14:06.777 }' 00:14:06.777 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.777 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:06.777 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:06.777 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.037 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.037 
13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.037 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.037 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.037 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.037 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.037 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.037 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.037 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:07.037 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:07.037 13:22:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:07.297 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:07.297 "name": "BaseBdev3", 00:14:07.297 "aliases": [ 00:14:07.297 "ffa6410e-d05f-4283-a328-8941a44e507c" 00:14:07.297 ], 00:14:07.297 "product_name": "Malloc disk", 00:14:07.297 "block_size": 512, 00:14:07.297 "num_blocks": 65536, 00:14:07.297 "uuid": "ffa6410e-d05f-4283-a328-8941a44e507c", 00:14:07.297 "assigned_rate_limits": { 00:14:07.297 "rw_ios_per_sec": 0, 00:14:07.297 "rw_mbytes_per_sec": 0, 00:14:07.297 "r_mbytes_per_sec": 0, 00:14:07.297 "w_mbytes_per_sec": 0 00:14:07.297 }, 00:14:07.297 "claimed": true, 00:14:07.297 "claim_type": "exclusive_write", 00:14:07.297 "zoned": false, 00:14:07.297 "supported_io_types": { 00:14:07.297 "read": true, 00:14:07.297 "write": true, 00:14:07.297 "unmap": true, 
00:14:07.297 "flush": true, 00:14:07.297 "reset": true, 00:14:07.297 "nvme_admin": false, 00:14:07.297 "nvme_io": false, 00:14:07.297 "nvme_io_md": false, 00:14:07.297 "write_zeroes": true, 00:14:07.297 "zcopy": true, 00:14:07.297 "get_zone_info": false, 00:14:07.297 "zone_management": false, 00:14:07.297 "zone_append": false, 00:14:07.297 "compare": false, 00:14:07.297 "compare_and_write": false, 00:14:07.297 "abort": true, 00:14:07.297 "seek_hole": false, 00:14:07.297 "seek_data": false, 00:14:07.297 "copy": true, 00:14:07.297 "nvme_iov_md": false 00:14:07.297 }, 00:14:07.297 "memory_domains": [ 00:14:07.297 { 00:14:07.297 "dma_device_id": "system", 00:14:07.297 "dma_device_type": 1 00:14:07.297 }, 00:14:07.297 { 00:14:07.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:07.297 "dma_device_type": 2 00:14:07.297 } 00:14:07.297 ], 00:14:07.297 "driver_specific": {} 00:14:07.297 }' 00:14:07.297 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.297 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:07.557 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:07.557 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.557 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:07.557 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:07.557 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.557 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:07.557 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:07.557 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.557 13:22:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:07.557 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:07.557 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:07.837 [2024-07-25 13:22:48.498945] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:07.837 [2024-07-25 13:22:48.498962] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:07.837 [2024-07-25 13:22:48.498998] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:07.837 [2024-07-25 13:22:48.499034] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:07.837 [2024-07-25 13:22:48.499040] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x223ab40 name Existed_Raid, state offline 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 899741 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 899741 ']' 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 899741 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 899741 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = 
sudo ']' 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 899741' 00:14:07.837 killing process with pid 899741 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 899741 00:14:07.837 [2024-07-25 13:22:48.565919] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:07.837 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 899741 00:14:07.837 [2024-07-25 13:22:48.580713] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:08.151 13:22:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:08.151 00:14:08.151 real 0m23.800s 00:14:08.151 user 0m44.662s 00:14:08.151 sys 0m3.467s 00:14:08.151 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:08.151 13:22:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:08.151 ************************************ 00:14:08.151 END TEST raid_state_function_test_sb 00:14:08.151 ************************************ 00:14:08.151 13:22:48 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:14:08.151 13:22:48 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:14:08.151 13:22:48 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:08.151 13:22:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:08.151 ************************************ 00:14:08.151 START TEST raid_superblock_test 00:14:08.151 ************************************ 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 3 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:14:08.151 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=904325 00:14:08.152 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 904325 /var/tmp/spdk-raid.sock 00:14:08.152 13:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 904325 ']' 00:14:08.152 13:22:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:08.152 13:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:08.152 13:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:08.152 13:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:08.152 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:08.152 13:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:08.152 13:22:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:08.152 [2024-07-25 13:22:48.833186] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:14:08.152 [2024-07-25 13:22:48.833239] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid904325 ] 00:14:08.152 [2024-07-25 13:22:48.922356] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:08.412 [2024-07-25 13:22:48.988742] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:08.412 [2024-07-25 13:22:49.027438] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:08.412 [2024-07-25 13:22:49.027461] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
(( i = 1 )) 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:08.983 13:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:09.244 malloc1 00:14:09.244 13:22:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:09.244 [2024-07-25 13:22:50.029283] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:09.244 [2024-07-25 13:22:50.029318] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:09.244 [2024-07-25 13:22:50.029330] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10799b0 00:14:09.244 [2024-07-25 13:22:50.029337] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:09.244 [2024-07-25 13:22:50.030733] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:09.244 [2024-07-25 13:22:50.030753] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: pt1 00:14:09.244 pt1 00:14:09.504 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:09.504 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:09.504 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:14:09.504 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:14:09.504 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:09.504 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:09.504 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:09.504 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:09.504 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:09.504 malloc2 00:14:09.504 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:09.764 [2024-07-25 13:22:50.400386] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:09.764 [2024-07-25 13:22:50.400417] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:09.764 [2024-07-25 13:22:50.400426] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x107adb0 00:14:09.764 [2024-07-25 13:22:50.400432] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:09.764 [2024-07-25 13:22:50.401685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:14:09.764 [2024-07-25 13:22:50.401705] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:09.764 pt2 00:14:09.764 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:09.764 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:09.764 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:14:09.764 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:14:09.764 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:09.764 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:09.764 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:14:09.764 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:09.764 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:10.024 malloc3 00:14:10.025 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:10.025 [2024-07-25 13:22:50.783342] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:10.025 [2024-07-25 13:22:50.783370] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:10.025 [2024-07-25 13:22:50.783379] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1211780 00:14:10.025 [2024-07-25 13:22:50.783385] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev 
claimed 00:14:10.025 [2024-07-25 13:22:50.784579] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:10.025 [2024-07-25 13:22:50.784598] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:10.025 pt3 00:14:10.025 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:14:10.025 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:14:10.025 13:22:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:10.284 [2024-07-25 13:22:50.975850] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:10.284 [2024-07-25 13:22:50.976837] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:10.284 [2024-07-25 13:22:50.976878] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:10.284 [2024-07-25 13:22:50.976981] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x10722e0 00:14:10.284 [2024-07-25 13:22:50.976988] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:10.285 [2024-07-25 13:22:50.977136] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1079680 00:14:10.285 [2024-07-25 13:22:50.977240] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10722e0 00:14:10.285 [2024-07-25 13:22:50.977245] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10722e0 00:14:10.285 [2024-07-25 13:22:50.977322] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:10.285 13:22:51 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.285 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:10.545 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.545 "name": "raid_bdev1", 00:14:10.545 "uuid": "ff9f47f2-780a-4430-b652-9b2c85d94c3a", 00:14:10.545 "strip_size_kb": 64, 00:14:10.545 "state": "online", 00:14:10.545 "raid_level": "raid0", 00:14:10.545 "superblock": true, 00:14:10.545 "num_base_bdevs": 3, 00:14:10.545 "num_base_bdevs_discovered": 3, 00:14:10.545 "num_base_bdevs_operational": 3, 00:14:10.545 "base_bdevs_list": [ 00:14:10.545 { 00:14:10.545 "name": "pt1", 00:14:10.545 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:10.545 "is_configured": true, 00:14:10.545 "data_offset": 2048, 00:14:10.545 "data_size": 63488 00:14:10.545 }, 00:14:10.545 { 
00:14:10.545 "name": "pt2", 00:14:10.545 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:10.545 "is_configured": true, 00:14:10.545 "data_offset": 2048, 00:14:10.545 "data_size": 63488 00:14:10.545 }, 00:14:10.545 { 00:14:10.545 "name": "pt3", 00:14:10.545 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:10.545 "is_configured": true, 00:14:10.545 "data_offset": 2048, 00:14:10.545 "data_size": 63488 00:14:10.545 } 00:14:10.545 ] 00:14:10.545 }' 00:14:10.545 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.545 13:22:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:11.115 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:14:11.115 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:11.115 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:11.115 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:11.115 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:11.115 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:11.115 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:11.115 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:11.375 [2024-07-25 13:22:51.930460] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:11.375 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:11.375 "name": "raid_bdev1", 00:14:11.375 "aliases": [ 00:14:11.375 "ff9f47f2-780a-4430-b652-9b2c85d94c3a" 00:14:11.375 ], 00:14:11.375 "product_name": "Raid Volume", 00:14:11.375 
"block_size": 512, 00:14:11.375 "num_blocks": 190464, 00:14:11.375 "uuid": "ff9f47f2-780a-4430-b652-9b2c85d94c3a", 00:14:11.375 "assigned_rate_limits": { 00:14:11.375 "rw_ios_per_sec": 0, 00:14:11.375 "rw_mbytes_per_sec": 0, 00:14:11.375 "r_mbytes_per_sec": 0, 00:14:11.375 "w_mbytes_per_sec": 0 00:14:11.375 }, 00:14:11.375 "claimed": false, 00:14:11.375 "zoned": false, 00:14:11.375 "supported_io_types": { 00:14:11.375 "read": true, 00:14:11.375 "write": true, 00:14:11.375 "unmap": true, 00:14:11.375 "flush": true, 00:14:11.375 "reset": true, 00:14:11.375 "nvme_admin": false, 00:14:11.375 "nvme_io": false, 00:14:11.375 "nvme_io_md": false, 00:14:11.375 "write_zeroes": true, 00:14:11.375 "zcopy": false, 00:14:11.375 "get_zone_info": false, 00:14:11.375 "zone_management": false, 00:14:11.375 "zone_append": false, 00:14:11.375 "compare": false, 00:14:11.375 "compare_and_write": false, 00:14:11.375 "abort": false, 00:14:11.375 "seek_hole": false, 00:14:11.375 "seek_data": false, 00:14:11.375 "copy": false, 00:14:11.375 "nvme_iov_md": false 00:14:11.375 }, 00:14:11.375 "memory_domains": [ 00:14:11.375 { 00:14:11.375 "dma_device_id": "system", 00:14:11.375 "dma_device_type": 1 00:14:11.375 }, 00:14:11.375 { 00:14:11.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.375 "dma_device_type": 2 00:14:11.375 }, 00:14:11.375 { 00:14:11.375 "dma_device_id": "system", 00:14:11.375 "dma_device_type": 1 00:14:11.375 }, 00:14:11.375 { 00:14:11.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.375 "dma_device_type": 2 00:14:11.375 }, 00:14:11.375 { 00:14:11.375 "dma_device_id": "system", 00:14:11.375 "dma_device_type": 1 00:14:11.375 }, 00:14:11.375 { 00:14:11.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.375 "dma_device_type": 2 00:14:11.375 } 00:14:11.375 ], 00:14:11.375 "driver_specific": { 00:14:11.375 "raid": { 00:14:11.375 "uuid": "ff9f47f2-780a-4430-b652-9b2c85d94c3a", 00:14:11.375 "strip_size_kb": 64, 00:14:11.375 "state": "online", 00:14:11.375 
"raid_level": "raid0", 00:14:11.375 "superblock": true, 00:14:11.375 "num_base_bdevs": 3, 00:14:11.375 "num_base_bdevs_discovered": 3, 00:14:11.375 "num_base_bdevs_operational": 3, 00:14:11.375 "base_bdevs_list": [ 00:14:11.375 { 00:14:11.375 "name": "pt1", 00:14:11.375 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:11.375 "is_configured": true, 00:14:11.375 "data_offset": 2048, 00:14:11.375 "data_size": 63488 00:14:11.375 }, 00:14:11.375 { 00:14:11.375 "name": "pt2", 00:14:11.375 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:11.375 "is_configured": true, 00:14:11.375 "data_offset": 2048, 00:14:11.375 "data_size": 63488 00:14:11.375 }, 00:14:11.375 { 00:14:11.375 "name": "pt3", 00:14:11.375 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:11.375 "is_configured": true, 00:14:11.375 "data_offset": 2048, 00:14:11.375 "data_size": 63488 00:14:11.375 } 00:14:11.375 ] 00:14:11.375 } 00:14:11.375 } 00:14:11.375 }' 00:14:11.375 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:11.375 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:11.375 pt2 00:14:11.375 pt3' 00:14:11.375 13:22:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:11.375 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:11.375 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:11.635 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:11.635 "name": "pt1", 00:14:11.635 "aliases": [ 00:14:11.635 "00000000-0000-0000-0000-000000000001" 00:14:11.635 ], 00:14:11.635 "product_name": "passthru", 00:14:11.635 "block_size": 512, 00:14:11.635 "num_blocks": 65536, 00:14:11.635 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:14:11.635 "assigned_rate_limits": { 00:14:11.635 "rw_ios_per_sec": 0, 00:14:11.635 "rw_mbytes_per_sec": 0, 00:14:11.635 "r_mbytes_per_sec": 0, 00:14:11.635 "w_mbytes_per_sec": 0 00:14:11.635 }, 00:14:11.635 "claimed": true, 00:14:11.635 "claim_type": "exclusive_write", 00:14:11.636 "zoned": false, 00:14:11.636 "supported_io_types": { 00:14:11.636 "read": true, 00:14:11.636 "write": true, 00:14:11.636 "unmap": true, 00:14:11.636 "flush": true, 00:14:11.636 "reset": true, 00:14:11.636 "nvme_admin": false, 00:14:11.636 "nvme_io": false, 00:14:11.636 "nvme_io_md": false, 00:14:11.636 "write_zeroes": true, 00:14:11.636 "zcopy": true, 00:14:11.636 "get_zone_info": false, 00:14:11.636 "zone_management": false, 00:14:11.636 "zone_append": false, 00:14:11.636 "compare": false, 00:14:11.636 "compare_and_write": false, 00:14:11.636 "abort": true, 00:14:11.636 "seek_hole": false, 00:14:11.636 "seek_data": false, 00:14:11.636 "copy": true, 00:14:11.636 "nvme_iov_md": false 00:14:11.636 }, 00:14:11.636 "memory_domains": [ 00:14:11.636 { 00:14:11.636 "dma_device_id": "system", 00:14:11.636 "dma_device_type": 1 00:14:11.636 }, 00:14:11.636 { 00:14:11.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:11.636 "dma_device_type": 2 00:14:11.636 } 00:14:11.636 ], 00:14:11.636 "driver_specific": { 00:14:11.636 "passthru": { 00:14:11.636 "name": "pt1", 00:14:11.636 "base_bdev_name": "malloc1" 00:14:11.636 } 00:14:11.636 } 00:14:11.636 }' 00:14:11.636 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:11.636 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:11.636 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:11.636 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:11.636 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:11.636 13:22:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:11.636 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:11.895 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:11.896 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:11.896 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:11.896 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:11.896 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:11.896 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:11.896 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:11.896 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.155 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.155 "name": "pt2", 00:14:12.155 "aliases": [ 00:14:12.155 "00000000-0000-0000-0000-000000000002" 00:14:12.155 ], 00:14:12.155 "product_name": "passthru", 00:14:12.155 "block_size": 512, 00:14:12.155 "num_blocks": 65536, 00:14:12.155 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:12.155 "assigned_rate_limits": { 00:14:12.155 "rw_ios_per_sec": 0, 00:14:12.155 "rw_mbytes_per_sec": 0, 00:14:12.155 "r_mbytes_per_sec": 0, 00:14:12.155 "w_mbytes_per_sec": 0 00:14:12.155 }, 00:14:12.155 "claimed": true, 00:14:12.155 "claim_type": "exclusive_write", 00:14:12.155 "zoned": false, 00:14:12.155 "supported_io_types": { 00:14:12.155 "read": true, 00:14:12.155 "write": true, 00:14:12.155 "unmap": true, 00:14:12.155 "flush": true, 00:14:12.155 "reset": true, 00:14:12.155 "nvme_admin": false, 00:14:12.155 
"nvme_io": false, 00:14:12.155 "nvme_io_md": false, 00:14:12.155 "write_zeroes": true, 00:14:12.155 "zcopy": true, 00:14:12.155 "get_zone_info": false, 00:14:12.155 "zone_management": false, 00:14:12.155 "zone_append": false, 00:14:12.155 "compare": false, 00:14:12.155 "compare_and_write": false, 00:14:12.155 "abort": true, 00:14:12.155 "seek_hole": false, 00:14:12.155 "seek_data": false, 00:14:12.155 "copy": true, 00:14:12.155 "nvme_iov_md": false 00:14:12.155 }, 00:14:12.155 "memory_domains": [ 00:14:12.155 { 00:14:12.155 "dma_device_id": "system", 00:14:12.155 "dma_device_type": 1 00:14:12.155 }, 00:14:12.155 { 00:14:12.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.155 "dma_device_type": 2 00:14:12.155 } 00:14:12.155 ], 00:14:12.155 "driver_specific": { 00:14:12.155 "passthru": { 00:14:12.155 "name": "pt2", 00:14:12.155 "base_bdev_name": "malloc2" 00:14:12.155 } 00:14:12.155 } 00:14:12.155 }' 00:14:12.155 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.155 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.155 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.155 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.155 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.155 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.155 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.415 13:22:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.415 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.415 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.415 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:14:12.415 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.415 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:12.415 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:12.415 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:12.676 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:12.676 "name": "pt3", 00:14:12.676 "aliases": [ 00:14:12.676 "00000000-0000-0000-0000-000000000003" 00:14:12.676 ], 00:14:12.676 "product_name": "passthru", 00:14:12.676 "block_size": 512, 00:14:12.676 "num_blocks": 65536, 00:14:12.676 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:12.676 "assigned_rate_limits": { 00:14:12.676 "rw_ios_per_sec": 0, 00:14:12.676 "rw_mbytes_per_sec": 0, 00:14:12.676 "r_mbytes_per_sec": 0, 00:14:12.676 "w_mbytes_per_sec": 0 00:14:12.676 }, 00:14:12.676 "claimed": true, 00:14:12.676 "claim_type": "exclusive_write", 00:14:12.676 "zoned": false, 00:14:12.676 "supported_io_types": { 00:14:12.676 "read": true, 00:14:12.676 "write": true, 00:14:12.676 "unmap": true, 00:14:12.676 "flush": true, 00:14:12.676 "reset": true, 00:14:12.676 "nvme_admin": false, 00:14:12.676 "nvme_io": false, 00:14:12.676 "nvme_io_md": false, 00:14:12.676 "write_zeroes": true, 00:14:12.676 "zcopy": true, 00:14:12.676 "get_zone_info": false, 00:14:12.676 "zone_management": false, 00:14:12.676 "zone_append": false, 00:14:12.676 "compare": false, 00:14:12.676 "compare_and_write": false, 00:14:12.676 "abort": true, 00:14:12.676 "seek_hole": false, 00:14:12.676 "seek_data": false, 00:14:12.676 "copy": true, 00:14:12.676 "nvme_iov_md": false 00:14:12.676 }, 00:14:12.676 "memory_domains": [ 00:14:12.676 { 00:14:12.676 "dma_device_id": "system", 00:14:12.676 
"dma_device_type": 1 00:14:12.676 }, 00:14:12.676 { 00:14:12.676 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:12.676 "dma_device_type": 2 00:14:12.676 } 00:14:12.676 ], 00:14:12.676 "driver_specific": { 00:14:12.676 "passthru": { 00:14:12.676 "name": "pt3", 00:14:12.676 "base_bdev_name": "malloc3" 00:14:12.676 } 00:14:12.676 } 00:14:12.676 }' 00:14:12.676 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.676 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:12.676 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:12.676 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.676 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:12.676 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:12.676 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.936 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:12.936 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:12.936 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.936 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:12.936 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:12.936 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:12.936 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:14:13.195 [2024-07-25 13:22:53.787155] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:13.195 13:22:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=ff9f47f2-780a-4430-b652-9b2c85d94c3a 00:14:13.195 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z ff9f47f2-780a-4430-b652-9b2c85d94c3a ']' 00:14:13.196 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:13.196 [2024-07-25 13:22:53.979423] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:13.196 [2024-07-25 13:22:53.979437] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:13.196 [2024-07-25 13:22:53.979472] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:13.196 [2024-07-25 13:22:53.979510] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:13.196 [2024-07-25 13:22:53.979516] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10722e0 name raid_bdev1, state offline 00:14:13.456 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.456 13:22:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:14:13.456 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:14:13.456 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:14:13.456 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:14:13.456 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:13.715 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i 
in "${base_bdevs_pt[@]}" 00:14:13.716 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:13.975 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:14:13.975 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:14.236 13:22:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:14.497 [2024-07-25 13:22:55.154351] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:14.497 [2024-07-25 13:22:55.155413] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:14.497 [2024-07-25 13:22:55.155446] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:14.497 [2024-07-25 13:22:55.155479] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:14:14.497 [2024-07-25 13:22:55.155505] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:14.497 [2024-07-25 13:22:55.155519] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:14.497 [2024-07-25 13:22:55.155534] bdev_raid.c:2398:raid_bdev_delete: 
*DEBUG*: delete raid bdev: raid_bdev1 00:14:14.497 [2024-07-25 13:22:55.155540] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1079e50 name raid_bdev1, state configuring 00:14:14.497 request: 00:14:14.497 { 00:14:14.497 "name": "raid_bdev1", 00:14:14.497 "raid_level": "raid0", 00:14:14.497 "base_bdevs": [ 00:14:14.497 "malloc1", 00:14:14.497 "malloc2", 00:14:14.497 "malloc3" 00:14:14.497 ], 00:14:14.497 "strip_size_kb": 64, 00:14:14.497 "superblock": false, 00:14:14.497 "method": "bdev_raid_create", 00:14:14.497 "req_id": 1 00:14:14.497 } 00:14:14.497 Got JSON-RPC error response 00:14:14.497 response: 00:14:14.497 { 00:14:14.497 "code": -17, 00:14:14.497 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:14.497 } 00:14:14.497 13:22:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:14:14.497 13:22:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:14:14.497 13:22:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:14:14.497 13:22:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:14:14.497 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:14:14.497 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:14.758 [2024-07-25 13:22:55.527246] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on malloc1 00:14:14.758 [2024-07-25 13:22:55.527268] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:14.758 [2024-07-25 13:22:55.527281] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1079be0 00:14:14.758 [2024-07-25 13:22:55.527287] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:14.758 [2024-07-25 13:22:55.528542] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:14.758 [2024-07-25 13:22:55.528568] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:14.758 [2024-07-25 13:22:55.528614] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:14.758 [2024-07-25 13:22:55.528632] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:14.758 pt1 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.758 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:15.019 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:15.019 "name": "raid_bdev1", 00:14:15.019 "uuid": "ff9f47f2-780a-4430-b652-9b2c85d94c3a", 00:14:15.019 "strip_size_kb": 64, 00:14:15.019 "state": "configuring", 00:14:15.019 "raid_level": "raid0", 00:14:15.019 "superblock": true, 00:14:15.019 "num_base_bdevs": 3, 00:14:15.019 "num_base_bdevs_discovered": 1, 00:14:15.019 "num_base_bdevs_operational": 3, 00:14:15.019 "base_bdevs_list": [ 00:14:15.019 { 00:14:15.019 "name": "pt1", 00:14:15.019 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:15.019 "is_configured": true, 00:14:15.019 "data_offset": 2048, 00:14:15.019 "data_size": 63488 00:14:15.019 }, 00:14:15.019 { 00:14:15.019 "name": null, 00:14:15.019 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:15.019 "is_configured": false, 00:14:15.019 "data_offset": 2048, 00:14:15.019 "data_size": 63488 00:14:15.019 }, 00:14:15.019 { 00:14:15.019 "name": null, 00:14:15.019 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:15.019 "is_configured": false, 00:14:15.019 "data_offset": 2048, 00:14:15.019 "data_size": 63488 00:14:15.019 } 00:14:15.019 ] 00:14:15.019 }' 00:14:15.019 13:22:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:15.019 13:22:55 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:15.589 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:14:15.589 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:15.849 [2024-07-25 13:22:56.429567] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:15.849 [2024-07-25 13:22:56.429595] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:15.849 [2024-07-25 13:22:56.429605] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x121c960 00:14:15.849 [2024-07-25 13:22:56.429611] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:15.849 [2024-07-25 13:22:56.429867] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:15.849 [2024-07-25 13:22:56.429878] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:15.849 [2024-07-25 13:22:56.429919] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:15.849 [2024-07-25 13:22:56.429931] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:15.849 pt2 00:14:15.849 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:15.849 [2024-07-25 13:22:56.626058] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:16.109 "name": "raid_bdev1", 00:14:16.109 "uuid": "ff9f47f2-780a-4430-b652-9b2c85d94c3a", 00:14:16.109 "strip_size_kb": 64, 00:14:16.109 "state": "configuring", 00:14:16.109 "raid_level": "raid0", 00:14:16.109 "superblock": true, 00:14:16.109 "num_base_bdevs": 3, 00:14:16.109 "num_base_bdevs_discovered": 1, 00:14:16.109 "num_base_bdevs_operational": 3, 00:14:16.109 "base_bdevs_list": [ 00:14:16.109 { 00:14:16.109 "name": "pt1", 00:14:16.109 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:16.109 "is_configured": true, 00:14:16.109 "data_offset": 2048, 00:14:16.109 "data_size": 63488 00:14:16.109 }, 00:14:16.109 { 00:14:16.109 "name": null, 00:14:16.109 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:16.109 "is_configured": false, 00:14:16.109 "data_offset": 2048, 00:14:16.109 "data_size": 63488 00:14:16.109 }, 00:14:16.109 { 00:14:16.109 "name": null, 00:14:16.109 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:16.109 "is_configured": false, 00:14:16.109 "data_offset": 2048, 00:14:16.109 "data_size": 63488 00:14:16.109 } 00:14:16.109 ] 00:14:16.109 }' 
00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:16.109 13:22:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:17.052 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:14:17.052 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:14:17.052 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:17.052 [2024-07-25 13:22:57.688746] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:17.052 [2024-07-25 13:22:57.688777] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:17.052 [2024-07-25 13:22:57.688786] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1072f60 00:14:17.052 [2024-07-25 13:22:57.688792] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:17.052 [2024-07-25 13:22:57.689057] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:17.052 [2024-07-25 13:22:57.689069] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:17.052 [2024-07-25 13:22:57.689113] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:17.052 [2024-07-25 13:22:57.689125] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:17.052 pt2 00:14:17.052 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:14:17.052 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:14:17.052 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:17.313 [2024-07-25 13:22:57.865193] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:17.313 [2024-07-25 13:22:57.865216] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:17.313 [2024-07-25 13:22:57.865224] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1072940 00:14:17.313 [2024-07-25 13:22:57.865230] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:17.313 [2024-07-25 13:22:57.865457] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:17.313 [2024-07-25 13:22:57.865466] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:17.313 [2024-07-25 13:22:57.865500] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:17.313 [2024-07-25 13:22:57.865510] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:17.313 [2024-07-25 13:22:57.865595] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x10704a0 00:14:17.313 [2024-07-25 13:22:57.865601] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:17.313 [2024-07-25 13:22:57.865729] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1212ea0 00:14:17.313 [2024-07-25 13:22:57.865826] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x10704a0 00:14:17.313 [2024-07-25 13:22:57.865831] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x10704a0 00:14:17.313 [2024-07-25 13:22:57.865901] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:17.313 pt3 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:17.313 13:22:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:17.313 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:17.313 "name": "raid_bdev1", 00:14:17.313 "uuid": "ff9f47f2-780a-4430-b652-9b2c85d94c3a", 00:14:17.313 "strip_size_kb": 64, 00:14:17.313 "state": "online", 00:14:17.313 "raid_level": "raid0", 00:14:17.313 "superblock": true, 00:14:17.313 "num_base_bdevs": 3, 00:14:17.313 "num_base_bdevs_discovered": 3, 00:14:17.313 "num_base_bdevs_operational": 3, 00:14:17.313 "base_bdevs_list": [ 00:14:17.313 { 00:14:17.313 
"name": "pt1", 00:14:17.313 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:17.313 "is_configured": true, 00:14:17.313 "data_offset": 2048, 00:14:17.313 "data_size": 63488 00:14:17.313 }, 00:14:17.313 { 00:14:17.313 "name": "pt2", 00:14:17.313 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:17.313 "is_configured": true, 00:14:17.313 "data_offset": 2048, 00:14:17.313 "data_size": 63488 00:14:17.313 }, 00:14:17.313 { 00:14:17.313 "name": "pt3", 00:14:17.313 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:17.313 "is_configured": true, 00:14:17.313 "data_offset": 2048, 00:14:17.313 "data_size": 63488 00:14:17.313 } 00:14:17.313 ] 00:14:17.313 }' 00:14:17.313 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:17.313 13:22:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:18.255 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:18.256 [2024-07-25 13:22:58.875968] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 
-- # raid_bdev_info='{ 00:14:18.256 "name": "raid_bdev1", 00:14:18.256 "aliases": [ 00:14:18.256 "ff9f47f2-780a-4430-b652-9b2c85d94c3a" 00:14:18.256 ], 00:14:18.256 "product_name": "Raid Volume", 00:14:18.256 "block_size": 512, 00:14:18.256 "num_blocks": 190464, 00:14:18.256 "uuid": "ff9f47f2-780a-4430-b652-9b2c85d94c3a", 00:14:18.256 "assigned_rate_limits": { 00:14:18.256 "rw_ios_per_sec": 0, 00:14:18.256 "rw_mbytes_per_sec": 0, 00:14:18.256 "r_mbytes_per_sec": 0, 00:14:18.256 "w_mbytes_per_sec": 0 00:14:18.256 }, 00:14:18.256 "claimed": false, 00:14:18.256 "zoned": false, 00:14:18.256 "supported_io_types": { 00:14:18.256 "read": true, 00:14:18.256 "write": true, 00:14:18.256 "unmap": true, 00:14:18.256 "flush": true, 00:14:18.256 "reset": true, 00:14:18.256 "nvme_admin": false, 00:14:18.256 "nvme_io": false, 00:14:18.256 "nvme_io_md": false, 00:14:18.256 "write_zeroes": true, 00:14:18.256 "zcopy": false, 00:14:18.256 "get_zone_info": false, 00:14:18.256 "zone_management": false, 00:14:18.256 "zone_append": false, 00:14:18.256 "compare": false, 00:14:18.256 "compare_and_write": false, 00:14:18.256 "abort": false, 00:14:18.256 "seek_hole": false, 00:14:18.256 "seek_data": false, 00:14:18.256 "copy": false, 00:14:18.256 "nvme_iov_md": false 00:14:18.256 }, 00:14:18.256 "memory_domains": [ 00:14:18.256 { 00:14:18.256 "dma_device_id": "system", 00:14:18.256 "dma_device_type": 1 00:14:18.256 }, 00:14:18.256 { 00:14:18.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.256 "dma_device_type": 2 00:14:18.256 }, 00:14:18.256 { 00:14:18.256 "dma_device_id": "system", 00:14:18.256 "dma_device_type": 1 00:14:18.256 }, 00:14:18.256 { 00:14:18.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.256 "dma_device_type": 2 00:14:18.256 }, 00:14:18.256 { 00:14:18.256 "dma_device_id": "system", 00:14:18.256 "dma_device_type": 1 00:14:18.256 }, 00:14:18.256 { 00:14:18.256 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.256 "dma_device_type": 2 00:14:18.256 } 00:14:18.256 ], 
00:14:18.256 "driver_specific": { 00:14:18.256 "raid": { 00:14:18.256 "uuid": "ff9f47f2-780a-4430-b652-9b2c85d94c3a", 00:14:18.256 "strip_size_kb": 64, 00:14:18.256 "state": "online", 00:14:18.256 "raid_level": "raid0", 00:14:18.256 "superblock": true, 00:14:18.256 "num_base_bdevs": 3, 00:14:18.256 "num_base_bdevs_discovered": 3, 00:14:18.256 "num_base_bdevs_operational": 3, 00:14:18.256 "base_bdevs_list": [ 00:14:18.256 { 00:14:18.256 "name": "pt1", 00:14:18.256 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:18.256 "is_configured": true, 00:14:18.256 "data_offset": 2048, 00:14:18.256 "data_size": 63488 00:14:18.256 }, 00:14:18.256 { 00:14:18.256 "name": "pt2", 00:14:18.256 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:18.256 "is_configured": true, 00:14:18.256 "data_offset": 2048, 00:14:18.256 "data_size": 63488 00:14:18.256 }, 00:14:18.256 { 00:14:18.256 "name": "pt3", 00:14:18.256 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:18.256 "is_configured": true, 00:14:18.256 "data_offset": 2048, 00:14:18.256 "data_size": 63488 00:14:18.256 } 00:14:18.256 ] 00:14:18.256 } 00:14:18.256 } 00:14:18.256 }' 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:18.256 pt2 00:14:18.256 pt3' 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:18.256 13:22:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:18.518 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:18.518 "name": "pt1", 00:14:18.518 "aliases": [ 
00:14:18.518 "00000000-0000-0000-0000-000000000001" 00:14:18.518 ], 00:14:18.518 "product_name": "passthru", 00:14:18.518 "block_size": 512, 00:14:18.518 "num_blocks": 65536, 00:14:18.518 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:18.518 "assigned_rate_limits": { 00:14:18.518 "rw_ios_per_sec": 0, 00:14:18.518 "rw_mbytes_per_sec": 0, 00:14:18.518 "r_mbytes_per_sec": 0, 00:14:18.518 "w_mbytes_per_sec": 0 00:14:18.518 }, 00:14:18.518 "claimed": true, 00:14:18.518 "claim_type": "exclusive_write", 00:14:18.518 "zoned": false, 00:14:18.518 "supported_io_types": { 00:14:18.518 "read": true, 00:14:18.518 "write": true, 00:14:18.518 "unmap": true, 00:14:18.518 "flush": true, 00:14:18.518 "reset": true, 00:14:18.518 "nvme_admin": false, 00:14:18.518 "nvme_io": false, 00:14:18.518 "nvme_io_md": false, 00:14:18.518 "write_zeroes": true, 00:14:18.518 "zcopy": true, 00:14:18.518 "get_zone_info": false, 00:14:18.518 "zone_management": false, 00:14:18.518 "zone_append": false, 00:14:18.518 "compare": false, 00:14:18.518 "compare_and_write": false, 00:14:18.518 "abort": true, 00:14:18.518 "seek_hole": false, 00:14:18.518 "seek_data": false, 00:14:18.518 "copy": true, 00:14:18.518 "nvme_iov_md": false 00:14:18.518 }, 00:14:18.518 "memory_domains": [ 00:14:18.518 { 00:14:18.518 "dma_device_id": "system", 00:14:18.518 "dma_device_type": 1 00:14:18.518 }, 00:14:18.518 { 00:14:18.518 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.518 "dma_device_type": 2 00:14:18.518 } 00:14:18.518 ], 00:14:18.518 "driver_specific": { 00:14:18.518 "passthru": { 00:14:18.518 "name": "pt1", 00:14:18.518 "base_bdev_name": "malloc1" 00:14:18.518 } 00:14:18.518 } 00:14:18.518 }' 00:14:18.518 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.518 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.518 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:18.518 13:22:59 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.518 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.779 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.779 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.779 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.779 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.779 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.779 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.779 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.779 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.779 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:18.779 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:19.040 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:19.040 "name": "pt2", 00:14:19.040 "aliases": [ 00:14:19.040 "00000000-0000-0000-0000-000000000002" 00:14:19.040 ], 00:14:19.040 "product_name": "passthru", 00:14:19.040 "block_size": 512, 00:14:19.040 "num_blocks": 65536, 00:14:19.040 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:19.040 "assigned_rate_limits": { 00:14:19.040 "rw_ios_per_sec": 0, 00:14:19.040 "rw_mbytes_per_sec": 0, 00:14:19.040 "r_mbytes_per_sec": 0, 00:14:19.040 "w_mbytes_per_sec": 0 00:14:19.040 }, 00:14:19.040 "claimed": true, 00:14:19.040 "claim_type": "exclusive_write", 00:14:19.040 "zoned": false, 00:14:19.040 "supported_io_types": { 
00:14:19.040 "read": true, 00:14:19.040 "write": true, 00:14:19.040 "unmap": true, 00:14:19.040 "flush": true, 00:14:19.040 "reset": true, 00:14:19.040 "nvme_admin": false, 00:14:19.040 "nvme_io": false, 00:14:19.040 "nvme_io_md": false, 00:14:19.040 "write_zeroes": true, 00:14:19.040 "zcopy": true, 00:14:19.040 "get_zone_info": false, 00:14:19.040 "zone_management": false, 00:14:19.040 "zone_append": false, 00:14:19.040 "compare": false, 00:14:19.040 "compare_and_write": false, 00:14:19.040 "abort": true, 00:14:19.040 "seek_hole": false, 00:14:19.040 "seek_data": false, 00:14:19.040 "copy": true, 00:14:19.040 "nvme_iov_md": false 00:14:19.040 }, 00:14:19.040 "memory_domains": [ 00:14:19.040 { 00:14:19.040 "dma_device_id": "system", 00:14:19.040 "dma_device_type": 1 00:14:19.040 }, 00:14:19.040 { 00:14:19.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.040 "dma_device_type": 2 00:14:19.040 } 00:14:19.040 ], 00:14:19.040 "driver_specific": { 00:14:19.040 "passthru": { 00:14:19.040 "name": "pt2", 00:14:19.040 "base_bdev_name": "malloc2" 00:14:19.040 } 00:14:19.040 } 00:14:19.040 }' 00:14:19.040 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.040 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.040 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:19.040 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.302 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.302 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:19.302 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.302 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.302 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:14:19.302 13:22:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.302 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.302 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:19.302 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:19.302 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:19.302 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:19.873 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:19.873 "name": "pt3", 00:14:19.873 "aliases": [ 00:14:19.873 "00000000-0000-0000-0000-000000000003" 00:14:19.873 ], 00:14:19.873 "product_name": "passthru", 00:14:19.873 "block_size": 512, 00:14:19.873 "num_blocks": 65536, 00:14:19.873 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:19.873 "assigned_rate_limits": { 00:14:19.873 "rw_ios_per_sec": 0, 00:14:19.873 "rw_mbytes_per_sec": 0, 00:14:19.873 "r_mbytes_per_sec": 0, 00:14:19.873 "w_mbytes_per_sec": 0 00:14:19.873 }, 00:14:19.873 "claimed": true, 00:14:19.873 "claim_type": "exclusive_write", 00:14:19.873 "zoned": false, 00:14:19.873 "supported_io_types": { 00:14:19.873 "read": true, 00:14:19.873 "write": true, 00:14:19.873 "unmap": true, 00:14:19.873 "flush": true, 00:14:19.873 "reset": true, 00:14:19.873 "nvme_admin": false, 00:14:19.873 "nvme_io": false, 00:14:19.873 "nvme_io_md": false, 00:14:19.873 "write_zeroes": true, 00:14:19.873 "zcopy": true, 00:14:19.873 "get_zone_info": false, 00:14:19.873 "zone_management": false, 00:14:19.873 "zone_append": false, 00:14:19.873 "compare": false, 00:14:19.873 "compare_and_write": false, 00:14:19.873 "abort": true, 00:14:19.873 "seek_hole": false, 00:14:19.873 "seek_data": 
false, 00:14:19.873 "copy": true, 00:14:19.873 "nvme_iov_md": false 00:14:19.873 }, 00:14:19.873 "memory_domains": [ 00:14:19.873 { 00:14:19.873 "dma_device_id": "system", 00:14:19.873 "dma_device_type": 1 00:14:19.873 }, 00:14:19.873 { 00:14:19.873 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:19.873 "dma_device_type": 2 00:14:19.873 } 00:14:19.873 ], 00:14:19.873 "driver_specific": { 00:14:19.873 "passthru": { 00:14:19.873 "name": "pt3", 00:14:19.873 "base_bdev_name": "malloc3" 00:14:19.873 } 00:14:19.873 } 00:14:19.873 }' 00:14:19.873 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.134 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:20.134 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:20.134 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.134 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:20.134 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:20.134 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.134 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:20.395 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:20.395 13:23:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.395 13:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:20.395 13:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:20.395 13:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:20.395 13:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 
-- # jq -r '.[] | .uuid' 00:14:20.656 [2024-07-25 13:23:01.229947] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' ff9f47f2-780a-4430-b652-9b2c85d94c3a '!=' ff9f47f2-780a-4430-b652-9b2c85d94c3a ']' 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 904325 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 904325 ']' 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 904325 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 904325 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 904325' 00:14:20.656 killing process with pid 904325 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 904325 00:14:20.656 [2024-07-25 13:23:01.317438] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:20.656 [2024-07-25 13:23:01.317475] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:20.656 
[2024-07-25 13:23:01.317511] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:20.656 [2024-07-25 13:23:01.317517] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x10704a0 name raid_bdev1, state offline 00:14:20.656 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 904325 00:14:20.656 [2024-07-25 13:23:01.332480] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:20.918 13:23:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:14:20.918 00:14:20.918 real 0m12.673s 00:14:20.918 user 0m23.381s 00:14:20.918 sys 0m1.839s 00:14:20.918 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:20.918 13:23:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:20.918 ************************************ 00:14:20.918 END TEST raid_superblock_test 00:14:20.918 ************************************ 00:14:20.918 13:23:01 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:14:20.918 13:23:01 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:20.918 13:23:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:20.918 13:23:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:20.918 ************************************ 00:14:20.918 START TEST raid_read_error_test 00:14:20.918 ************************************ 00:14:20.918 13:23:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 3 read 00:14:20.918 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:14:20.918 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:14:20.918 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 
00:14:20.918 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:14:20.918 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:20.918 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:14:20.919 13:23:01 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.UgLNdxd62j 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=906745 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 906745 /var/tmp/spdk-raid.sock 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 906745 ']' 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:20.919 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:20.919 13:23:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:20.919 [2024-07-25 13:23:01.597416] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:14:20.919 [2024-07-25 13:23:01.597473] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid906745 ] 00:14:20.919 [2024-07-25 13:23:01.687482] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:21.181 [2024-07-25 13:23:01.763713] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:21.181 [2024-07-25 13:23:01.810686] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:21.181 [2024-07-25 13:23:01.810713] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:21.441 13:23:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:21.441 13:23:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:14:21.441 13:23:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:21.441 13:23:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:21.701 BaseBdev1_malloc 00:14:21.702 13:23:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:22.271 true 00:14:22.271 13:23:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:22.531 [2024-07-25 13:23:03.151012] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:22.531 [2024-07-25 13:23:03.151045] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:14:22.531 [2024-07-25 13:23:03.151056] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xeca2a0 00:14:22.531 [2024-07-25 13:23:03.151063] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:22.531 [2024-07-25 13:23:03.152384] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:22.531 [2024-07-25 13:23:03.152405] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:22.531 BaseBdev1 00:14:22.531 13:23:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:22.531 13:23:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:22.794 BaseBdev2_malloc 00:14:22.794 13:23:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:23.365 true 00:14:23.365 13:23:03 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:23.624 [2024-07-25 13:23:04.219608] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:23.624 [2024-07-25 13:23:04.219638] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:23.624 [2024-07-25 13:23:04.219651] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf89420 00:14:23.625 [2024-07-25 13:23:04.219657] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:23.625 [2024-07-25 13:23:04.220846] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:23.625 [2024-07-25 13:23:04.220871] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:23.625 BaseBdev2 00:14:23.625 13:23:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:23.625 13:23:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:23.884 BaseBdev3_malloc 00:14:23.884 13:23:04 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:24.455 true 00:14:24.455 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:24.715 [2024-07-25 13:23:05.292169] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:24.715 [2024-07-25 13:23:05.292200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:24.715 [2024-07-25 13:23:05.292214] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8af70 00:14:24.715 [2024-07-25 13:23:05.292221] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:24.715 [2024-07-25 13:23:05.293425] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:24.715 [2024-07-25 13:23:05.293445] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:24.715 BaseBdev3 00:14:24.715 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:25.285 [2024-07-25 13:23:05.821508] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:25.285 [2024-07-25 13:23:05.822530] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:25.285 [2024-07-25 13:23:05.822593] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:25.285 [2024-07-25 13:23:05.822736] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xf8ccc0 00:14:25.285 [2024-07-25 13:23:05.822744] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:25.285 [2024-07-25 13:23:05.822895] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf8df70 00:14:25.285 [2024-07-25 13:23:05.823009] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf8ccc0 00:14:25.285 [2024-07-25 13:23:05.823014] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf8ccc0 00:14:25.285 [2024-07-25 13:23:05.823100] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:25.285 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:25.285 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:25.286 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:25.286 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:25.286 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:25.286 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:25.286 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:25.286 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:25.286 
13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:25.286 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:25.286 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.286 13:23:05 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:25.855 13:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:25.855 "name": "raid_bdev1", 00:14:25.855 "uuid": "806c2109-b953-4041-b8e6-60f373d90ac0", 00:14:25.855 "strip_size_kb": 64, 00:14:25.855 "state": "online", 00:14:25.855 "raid_level": "raid0", 00:14:25.855 "superblock": true, 00:14:25.855 "num_base_bdevs": 3, 00:14:25.855 "num_base_bdevs_discovered": 3, 00:14:25.855 "num_base_bdevs_operational": 3, 00:14:25.855 "base_bdevs_list": [ 00:14:25.855 { 00:14:25.855 "name": "BaseBdev1", 00:14:25.855 "uuid": "05d9d547-c979-5556-8d0b-37372ce6bc4f", 00:14:25.855 "is_configured": true, 00:14:25.855 "data_offset": 2048, 00:14:25.855 "data_size": 63488 00:14:25.855 }, 00:14:25.855 { 00:14:25.855 "name": "BaseBdev2", 00:14:25.855 "uuid": "d9af33df-0422-573a-b99d-e4ddf0a4c2c7", 00:14:25.855 "is_configured": true, 00:14:25.855 "data_offset": 2048, 00:14:25.855 "data_size": 63488 00:14:25.855 }, 00:14:25.855 { 00:14:25.855 "name": "BaseBdev3", 00:14:25.855 "uuid": "c19b6cff-a607-59f0-84f7-1fa86a4ca011", 00:14:25.855 "is_configured": true, 00:14:25.855 "data_offset": 2048, 00:14:25.855 "data_size": 63488 00:14:25.855 } 00:14:25.855 ] 00:14:25.855 }' 00:14:25.855 13:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:25.855 13:23:06 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:26.426 13:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- 
# sleep 1 00:14:26.426 13:23:06 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:26.426 [2024-07-25 13:23:07.040826] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbcc6c0 00:14:27.366 13:23:07 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:27.366 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:14:27.626 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.626 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:27.626 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:27.626 "name": "raid_bdev1", 00:14:27.626 "uuid": "806c2109-b953-4041-b8e6-60f373d90ac0", 00:14:27.626 "strip_size_kb": 64, 00:14:27.626 "state": "online", 00:14:27.626 "raid_level": "raid0", 00:14:27.626 "superblock": true, 00:14:27.626 "num_base_bdevs": 3, 00:14:27.626 "num_base_bdevs_discovered": 3, 00:14:27.626 "num_base_bdevs_operational": 3, 00:14:27.626 "base_bdevs_list": [ 00:14:27.626 { 00:14:27.626 "name": "BaseBdev1", 00:14:27.626 "uuid": "05d9d547-c979-5556-8d0b-37372ce6bc4f", 00:14:27.626 "is_configured": true, 00:14:27.626 "data_offset": 2048, 00:14:27.626 "data_size": 63488 00:14:27.626 }, 00:14:27.626 { 00:14:27.626 "name": "BaseBdev2", 00:14:27.626 "uuid": "d9af33df-0422-573a-b99d-e4ddf0a4c2c7", 00:14:27.626 "is_configured": true, 00:14:27.626 "data_offset": 2048, 00:14:27.626 "data_size": 63488 00:14:27.626 }, 00:14:27.626 { 00:14:27.626 "name": "BaseBdev3", 00:14:27.626 "uuid": "c19b6cff-a607-59f0-84f7-1fa86a4ca011", 00:14:27.626 "is_configured": true, 00:14:27.626 "data_offset": 2048, 00:14:27.626 "data_size": 63488 00:14:27.626 } 00:14:27.626 ] 00:14:27.626 }' 00:14:27.626 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:27.626 13:23:08 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:28.195 13:23:08 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:28.455 [2024-07-25 13:23:09.081496] 
bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:28.455 [2024-07-25 13:23:09.081529] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:28.455 [2024-07-25 13:23:09.084112] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:28.456 [2024-07-25 13:23:09.084137] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:28.456 [2024-07-25 13:23:09.084161] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:28.456 [2024-07-25 13:23:09.084167] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf8ccc0 name raid_bdev1, state offline 00:14:28.456 0 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 906745 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 906745 ']' 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 906745 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 906745 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 906745' 00:14:28.456 killing process with pid 906745 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 906745 00:14:28.456 [2024-07-25 13:23:09.168327] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:14:28.456 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 906745 00:14:28.456 [2024-07-25 13:23:09.179479] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:28.716 13:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.UgLNdxd62j 00:14:28.716 13:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:14:28.716 13:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:14:28.716 13:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.49 00:14:28.716 13:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:14:28.716 13:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:28.716 13:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:28.716 13:23:09 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.49 != \0\.\0\0 ]] 00:14:28.716 00:14:28.716 real 0m7.783s 00:14:28.716 user 0m13.487s 00:14:28.716 sys 0m1.019s 00:14:28.716 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:28.716 13:23:09 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:28.716 ************************************ 00:14:28.716 END TEST raid_read_error_test 00:14:28.716 ************************************ 00:14:28.716 13:23:09 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:14:28.716 13:23:09 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:28.716 13:23:09 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:28.716 13:23:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:28.716 ************************************ 00:14:28.716 START TEST raid_write_error_test 00:14:28.716 ************************************ 
00:14:28.716 13:23:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 3 write 00:14:28.716 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:14:28.716 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:14:28.716 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:14:28.716 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:14:28.716 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:28.716 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # 
local strip_size 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.GfH82MtAZp 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=908091 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 908091 /var/tmp/spdk-raid.sock 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 908091 ']' 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:28.717 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:28.717 13:23:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:28.717 [2024-07-25 13:23:09.502846] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:14:28.717 [2024-07-25 13:23:09.502977] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid908091 ] 00:14:28.977 [2024-07-25 13:23:09.645729] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:28.977 [2024-07-25 13:23:09.722250] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:28.977 [2024-07-25 13:23:09.761610] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:28.977 [2024-07-25 13:23:09.761647] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:29.916 13:23:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:29.916 13:23:10 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:14:29.916 13:23:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:29.916 13:23:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:30.176 BaseBdev1_malloc 00:14:30.176 13:23:10 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:30.436 true 00:14:30.436 13:23:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:30.436 [2024-07-25 13:23:11.209367] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:30.437 [2024-07-25 13:23:11.209399] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:30.437 [2024-07-25 13:23:11.209411] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26942a0 00:14:30.437 [2024-07-25 13:23:11.209417] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:30.437 [2024-07-25 13:23:11.210722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:30.437 [2024-07-25 13:23:11.210742] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:30.437 BaseBdev1 00:14:30.437 13:23:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:30.437 13:23:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:30.697 BaseBdev2_malloc 00:14:30.957 13:23:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:30.957 true 00:14:30.957 13:23:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:31.217 [2024-07-25 13:23:11.925151] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:31.217 [2024-07-25 13:23:11.925183] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:31.217 [2024-07-25 
13:23:11.925195] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2753420 00:14:31.217 [2024-07-25 13:23:11.925202] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:31.217 [2024-07-25 13:23:11.926384] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:31.217 [2024-07-25 13:23:11.926404] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:31.217 BaseBdev2 00:14:31.217 13:23:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:14:31.217 13:23:11 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:31.476 BaseBdev3_malloc 00:14:31.476 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:31.737 true 00:14:31.737 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:31.997 [2024-07-25 13:23:12.580730] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:31.997 [2024-07-25 13:23:12.580765] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:31.997 [2024-07-25 13:23:12.580779] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2754f70 00:14:31.997 [2024-07-25 13:23:12.580785] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:31.997 [2024-07-25 13:23:12.581972] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:31.997 [2024-07-25 13:23:12.581991] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:31.997 BaseBdev3 00:14:31.997 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:31.997 [2024-07-25 13:23:12.757196] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:31.997 [2024-07-25 13:23:12.758191] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:31.997 [2024-07-25 13:23:12.758246] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:31.997 [2024-07-25 13:23:12.758389] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2756cc0 00:14:31.997 [2024-07-25 13:23:12.758396] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:31.997 [2024-07-25 13:23:12.758541] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2757f70 00:14:31.997 [2024-07-25 13:23:12.758662] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2756cc0 00:14:31.997 [2024-07-25 13:23:12.758667] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2756cc0 00:14:31.997 [2024-07-25 13:23:12.758752] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:31.997 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:31.997 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:31.997 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:31.997 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:31.997 13:23:12 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:31.997 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:31.997 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:31.997 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:31.998 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:31.998 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:31.998 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:31.998 13:23:12 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:32.291 13:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:32.291 "name": "raid_bdev1", 00:14:32.291 "uuid": "fa3eafe9-ce79-491d-8e4d-f682b2532871", 00:14:32.291 "strip_size_kb": 64, 00:14:32.291 "state": "online", 00:14:32.291 "raid_level": "raid0", 00:14:32.291 "superblock": true, 00:14:32.291 "num_base_bdevs": 3, 00:14:32.291 "num_base_bdevs_discovered": 3, 00:14:32.291 "num_base_bdevs_operational": 3, 00:14:32.291 "base_bdevs_list": [ 00:14:32.291 { 00:14:32.291 "name": "BaseBdev1", 00:14:32.291 "uuid": "1146b379-00a3-51b3-9599-a647669c6d09", 00:14:32.291 "is_configured": true, 00:14:32.291 "data_offset": 2048, 00:14:32.291 "data_size": 63488 00:14:32.291 }, 00:14:32.291 { 00:14:32.291 "name": "BaseBdev2", 00:14:32.291 "uuid": "ce1fe85a-b015-51d6-be3f-bd060ed23a06", 00:14:32.291 "is_configured": true, 00:14:32.291 "data_offset": 2048, 00:14:32.291 "data_size": 63488 00:14:32.291 }, 00:14:32.291 { 00:14:32.291 "name": "BaseBdev3", 00:14:32.291 "uuid": 
"06577771-b036-5e51-8659-2727b818ba40", 00:14:32.291 "is_configured": true, 00:14:32.291 "data_offset": 2048, 00:14:32.291 "data_size": 63488 00:14:32.291 } 00:14:32.291 ] 00:14:32.291 }' 00:14:32.291 13:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:32.291 13:23:13 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.259 13:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:14:33.259 13:23:13 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:33.259 [2024-07-25 13:23:14.024662] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23966c0 00:14:34.199 13:23:14 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:34.460 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:34.720 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:34.720 "name": "raid_bdev1", 00:14:34.720 "uuid": "fa3eafe9-ce79-491d-8e4d-f682b2532871", 00:14:34.720 "strip_size_kb": 64, 00:14:34.720 "state": "online", 00:14:34.720 "raid_level": "raid0", 00:14:34.720 "superblock": true, 00:14:34.720 "num_base_bdevs": 3, 00:14:34.720 "num_base_bdevs_discovered": 3, 00:14:34.720 "num_base_bdevs_operational": 3, 00:14:34.720 "base_bdevs_list": [ 00:14:34.720 { 00:14:34.720 "name": "BaseBdev1", 00:14:34.720 "uuid": "1146b379-00a3-51b3-9599-a647669c6d09", 00:14:34.720 "is_configured": true, 00:14:34.720 "data_offset": 2048, 00:14:34.720 "data_size": 63488 00:14:34.720 }, 00:14:34.720 { 00:14:34.720 "name": "BaseBdev2", 00:14:34.720 "uuid": "ce1fe85a-b015-51d6-be3f-bd060ed23a06", 00:14:34.720 "is_configured": true, 00:14:34.720 "data_offset": 2048, 00:14:34.720 "data_size": 63488 00:14:34.720 }, 00:14:34.720 { 00:14:34.720 "name": "BaseBdev3", 00:14:34.720 "uuid": "06577771-b036-5e51-8659-2727b818ba40", 00:14:34.720 "is_configured": true, 00:14:34.720 "data_offset": 2048, 00:14:34.720 "data_size": 63488 00:14:34.720 } 00:14:34.720 
] 00:14:34.720 }' 00:14:34.720 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:34.720 13:23:15 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.291 13:23:15 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:35.291 [2024-07-25 13:23:16.076401] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:35.291 [2024-07-25 13:23:16.076428] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:35.291 [2024-07-25 13:23:16.079015] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:35.291 [2024-07-25 13:23:16.079040] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:35.291 [2024-07-25 13:23:16.079063] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:35.291 [2024-07-25 13:23:16.079075] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2756cc0 name raid_bdev1, state offline 00:14:35.291 0 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 908091 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 908091 ']' 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 908091 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 908091 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:14:35.551 
13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 908091' 00:14:35.551 killing process with pid 908091 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 908091 00:14:35.551 [2024-07-25 13:23:16.185801] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 908091 00:14:35.551 [2024-07-25 13:23:16.197032] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.GfH82MtAZp 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.49 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.49 != \0\.\0\0 ]] 00:14:35.551 00:14:35.551 real 0m6.942s 00:14:35.551 user 0m11.336s 00:14:35.551 sys 0m0.997s 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:14:35.551 13:23:16 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.551 ************************************ 00:14:35.551 END TEST raid_write_error_test 00:14:35.551 ************************************ 00:14:35.811 13:23:16 bdev_raid -- 
bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:14:35.811 13:23:16 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:14:35.811 13:23:16 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:14:35.811 13:23:16 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:14:35.811 13:23:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:35.811 ************************************ 00:14:35.811 START TEST raid_state_function_test 00:14:35.811 ************************************ 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 3 false 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=909410 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 909410' 00:14:35.811 
Process raid pid: 909410 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 909410 /var/tmp/spdk-raid.sock 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 909410 ']' 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:35.811 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:14:35.811 13:23:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:35.811 [2024-07-25 13:23:16.465029] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:14:35.811 [2024-07-25 13:23:16.465101] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:35.811 [2024-07-25 13:23:16.567849] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:36.070 [2024-07-25 13:23:16.634846] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:14:36.070 [2024-07-25 13:23:16.673436] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:36.071 [2024-07-25 13:23:16.673457] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:36.639 13:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:14:36.639 13:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:14:36.639 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:36.899 [2024-07-25 13:23:17.484490] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:36.899 [2024-07-25 13:23:17.484521] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:36.899 [2024-07-25 13:23:17.484527] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:36.899 [2024-07-25 13:23:17.484533] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:36.899 [2024-07-25 13:23:17.484538] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:36.899 [2024-07-25 13:23:17.484544] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:36.899 
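The trace around this point runs `bdev_raid_create` before any base bdevs exist, then `verify_raid_bdev_state Existed_Raid configuring concat 64 3`, which captures `bdev_raid_get_bdevs all` and filters it with `jq -r '.[] | select(.name == "Existed_Raid")'`. A minimal Python sketch of that selection and the state check follows; the real helper is bash + jq in `bdev_raid.sh`, the function names here are invented for illustration, and the sample payload copies field names and values from the JSON dumped in this log.

```python
import json

def select_raid(bdevs, name):
    # Mimics: jq -r '.[] | select(.name == "<name>")' over bdev_raid_get_bdevs output.
    return next((b for b in bdevs if b["name"] == name), None)

def expect_raid_state(info, state, level, strip_kb, operational):
    # Rough analogue of verify_raid_bdev_state's field assertions.
    assert info["state"] == state
    assert info["raid_level"] == level
    assert info["strip_size_kb"] == strip_kb
    assert info["num_base_bdevs_operational"] == operational

# Payload shaped like the log's dump: raid created with no base bdevs yet,
# so it sits in "configuring" with zero discovered members.
raw = json.dumps([{
    "name": "Existed_Raid",
    "strip_size_kb": 64,
    "state": "configuring",
    "raid_level": "concat",
    "num_base_bdevs": 3,
    "num_base_bdevs_discovered": 0,
    "num_base_bdevs_operational": 3,
}])

info = select_raid(json.loads(raw), "Existed_Raid")
expect_raid_state(info, "configuring", "concat", 64, 3)
```

As the log shows later, each `bdev_malloc_create` bumps `num_base_bdevs_discovered` until the raid leaves `configuring`.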
13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:36.899 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:37.159 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.159 "name": "Existed_Raid", 00:14:37.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.159 "strip_size_kb": 64, 00:14:37.159 "state": "configuring", 00:14:37.159 "raid_level": "concat", 00:14:37.159 "superblock": false, 00:14:37.159 "num_base_bdevs": 3, 00:14:37.159 "num_base_bdevs_discovered": 0, 00:14:37.159 "num_base_bdevs_operational": 3, 00:14:37.159 "base_bdevs_list": [ 00:14:37.159 { 
00:14:37.159 "name": "BaseBdev1", 00:14:37.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.159 "is_configured": false, 00:14:37.159 "data_offset": 0, 00:14:37.159 "data_size": 0 00:14:37.159 }, 00:14:37.159 { 00:14:37.159 "name": "BaseBdev2", 00:14:37.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.159 "is_configured": false, 00:14:37.159 "data_offset": 0, 00:14:37.159 "data_size": 0 00:14:37.159 }, 00:14:37.159 { 00:14:37.159 "name": "BaseBdev3", 00:14:37.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:37.159 "is_configured": false, 00:14:37.159 "data_offset": 0, 00:14:37.159 "data_size": 0 00:14:37.159 } 00:14:37.159 ] 00:14:37.159 }' 00:14:37.159 13:23:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.159 13:23:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.098 13:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:38.098 [2024-07-25 13:23:18.875871] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:38.098 [2024-07-25 13:23:18.875893] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x169f6d0 name Existed_Raid, state configuring 00:14:38.358 13:23:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:38.358 [2024-07-25 13:23:19.072384] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:38.358 [2024-07-25 13:23:19.072402] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:38.358 [2024-07-25 13:23:19.072411] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:14:38.358 [2024-07-25 13:23:19.072417] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:38.358 [2024-07-25 13:23:19.072421] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:38.358 [2024-07-25 13:23:19.072427] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:38.358 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:38.618 [2024-07-25 13:23:19.267433] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:38.618 BaseBdev1 00:14:38.618 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:38.618 13:23:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:38.618 13:23:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:38.618 13:23:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:38.618 13:23:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:38.618 13:23:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:38.618 13:23:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:38.878 [ 00:14:38.878 { 00:14:38.878 "name": "BaseBdev1", 00:14:38.878 "aliases": [ 00:14:38.878 
"545c9df6-dbfd-4695-b0d8-52f62017a529" 00:14:38.878 ], 00:14:38.878 "product_name": "Malloc disk", 00:14:38.878 "block_size": 512, 00:14:38.878 "num_blocks": 65536, 00:14:38.878 "uuid": "545c9df6-dbfd-4695-b0d8-52f62017a529", 00:14:38.878 "assigned_rate_limits": { 00:14:38.878 "rw_ios_per_sec": 0, 00:14:38.878 "rw_mbytes_per_sec": 0, 00:14:38.878 "r_mbytes_per_sec": 0, 00:14:38.878 "w_mbytes_per_sec": 0 00:14:38.878 }, 00:14:38.878 "claimed": true, 00:14:38.878 "claim_type": "exclusive_write", 00:14:38.878 "zoned": false, 00:14:38.878 "supported_io_types": { 00:14:38.878 "read": true, 00:14:38.878 "write": true, 00:14:38.878 "unmap": true, 00:14:38.878 "flush": true, 00:14:38.878 "reset": true, 00:14:38.878 "nvme_admin": false, 00:14:38.878 "nvme_io": false, 00:14:38.878 "nvme_io_md": false, 00:14:38.878 "write_zeroes": true, 00:14:38.878 "zcopy": true, 00:14:38.878 "get_zone_info": false, 00:14:38.878 "zone_management": false, 00:14:38.878 "zone_append": false, 00:14:38.878 "compare": false, 00:14:38.878 "compare_and_write": false, 00:14:38.878 "abort": true, 00:14:38.878 "seek_hole": false, 00:14:38.878 "seek_data": false, 00:14:38.878 "copy": true, 00:14:38.878 "nvme_iov_md": false 00:14:38.878 }, 00:14:38.878 "memory_domains": [ 00:14:38.878 { 00:14:38.878 "dma_device_id": "system", 00:14:38.878 "dma_device_type": 1 00:14:38.878 }, 00:14:38.878 { 00:14:38.878 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:38.878 "dma_device_type": 2 00:14:38.878 } 00:14:38.878 ], 00:14:38.878 "driver_specific": {} 00:14:38.878 } 00:14:38.878 ] 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:38.878 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:39.138 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.138 "name": "Existed_Raid", 00:14:39.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.138 "strip_size_kb": 64, 00:14:39.138 "state": "configuring", 00:14:39.138 "raid_level": "concat", 00:14:39.138 "superblock": false, 00:14:39.138 "num_base_bdevs": 3, 00:14:39.138 "num_base_bdevs_discovered": 1, 00:14:39.138 "num_base_bdevs_operational": 3, 00:14:39.138 "base_bdevs_list": [ 00:14:39.138 { 00:14:39.138 "name": "BaseBdev1", 00:14:39.138 "uuid": "545c9df6-dbfd-4695-b0d8-52f62017a529", 00:14:39.138 "is_configured": true, 00:14:39.138 "data_offset": 0, 00:14:39.138 "data_size": 65536 00:14:39.138 }, 00:14:39.138 { 00:14:39.138 "name": "BaseBdev2", 00:14:39.138 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:14:39.138 "is_configured": false, 00:14:39.138 "data_offset": 0, 00:14:39.138 "data_size": 0 00:14:39.138 }, 00:14:39.138 { 00:14:39.138 "name": "BaseBdev3", 00:14:39.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:39.138 "is_configured": false, 00:14:39.138 "data_offset": 0, 00:14:39.138 "data_size": 0 00:14:39.138 } 00:14:39.138 ] 00:14:39.138 }' 00:14:39.138 13:23:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.138 13:23:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.078 13:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:40.337 [2024-07-25 13:23:20.931649] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:40.337 [2024-07-25 13:23:20.931676] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x169efa0 name Existed_Raid, state configuring 00:14:40.337 13:23:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:40.337 [2024-07-25 13:23:21.128173] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:40.598 [2024-07-25 13:23:21.129288] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:40.598 [2024-07-25 13:23:21.129312] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:40.598 [2024-07-25 13:23:21.129319] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:40.598 [2024-07-25 13:23:21.129325] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:40.598 "name": "Existed_Raid", 00:14:40.598 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.598 "strip_size_kb": 64, 00:14:40.598 "state": "configuring", 00:14:40.598 
"raid_level": "concat", 00:14:40.598 "superblock": false, 00:14:40.598 "num_base_bdevs": 3, 00:14:40.598 "num_base_bdevs_discovered": 1, 00:14:40.598 "num_base_bdevs_operational": 3, 00:14:40.598 "base_bdevs_list": [ 00:14:40.598 { 00:14:40.598 "name": "BaseBdev1", 00:14:40.598 "uuid": "545c9df6-dbfd-4695-b0d8-52f62017a529", 00:14:40.598 "is_configured": true, 00:14:40.598 "data_offset": 0, 00:14:40.598 "data_size": 65536 00:14:40.598 }, 00:14:40.598 { 00:14:40.598 "name": "BaseBdev2", 00:14:40.598 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.598 "is_configured": false, 00:14:40.598 "data_offset": 0, 00:14:40.598 "data_size": 0 00:14:40.598 }, 00:14:40.598 { 00:14:40.598 "name": "BaseBdev3", 00:14:40.598 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:40.598 "is_configured": false, 00:14:40.598 "data_offset": 0, 00:14:40.598 "data_size": 0 00:14:40.598 } 00:14:40.598 ] 00:14:40.598 }' 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:40.598 13:23:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.169 13:23:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:41.429 [2024-07-25 13:23:22.019470] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:41.429 BaseBdev2 00:14:41.429 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:41.429 13:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:41.429 13:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:41.429 13:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:41.429 13:23:22 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:41.429 13:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:41.429 13:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:41.429 13:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:41.688 [ 00:14:41.688 { 00:14:41.688 "name": "BaseBdev2", 00:14:41.688 "aliases": [ 00:14:41.688 "64d0ca9b-705c-458c-ad34-f6e61a1b351c" 00:14:41.688 ], 00:14:41.688 "product_name": "Malloc disk", 00:14:41.688 "block_size": 512, 00:14:41.689 "num_blocks": 65536, 00:14:41.689 "uuid": "64d0ca9b-705c-458c-ad34-f6e61a1b351c", 00:14:41.689 "assigned_rate_limits": { 00:14:41.689 "rw_ios_per_sec": 0, 00:14:41.689 "rw_mbytes_per_sec": 0, 00:14:41.689 "r_mbytes_per_sec": 0, 00:14:41.689 "w_mbytes_per_sec": 0 00:14:41.689 }, 00:14:41.689 "claimed": true, 00:14:41.689 "claim_type": "exclusive_write", 00:14:41.689 "zoned": false, 00:14:41.689 "supported_io_types": { 00:14:41.689 "read": true, 00:14:41.689 "write": true, 00:14:41.689 "unmap": true, 00:14:41.689 "flush": true, 00:14:41.689 "reset": true, 00:14:41.689 "nvme_admin": false, 00:14:41.689 "nvme_io": false, 00:14:41.689 "nvme_io_md": false, 00:14:41.689 "write_zeroes": true, 00:14:41.689 "zcopy": true, 00:14:41.689 "get_zone_info": false, 00:14:41.689 "zone_management": false, 00:14:41.689 "zone_append": false, 00:14:41.689 "compare": false, 00:14:41.689 "compare_and_write": false, 00:14:41.689 "abort": true, 00:14:41.689 "seek_hole": false, 00:14:41.689 "seek_data": false, 00:14:41.689 "copy": true, 00:14:41.689 "nvme_iov_md": false 00:14:41.689 }, 00:14:41.689 "memory_domains": [ 00:14:41.689 { 00:14:41.689 "dma_device_id": "system", 
00:14:41.689 "dma_device_type": 1 00:14:41.689 }, 00:14:41.689 { 00:14:41.689 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:41.689 "dma_device_type": 2 00:14:41.689 } 00:14:41.689 ], 00:14:41.689 "driver_specific": {} 00:14:41.689 } 00:14:41.689 ] 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:41.689 13:23:22 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:41.948 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:41.948 "name": "Existed_Raid", 00:14:41.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.948 "strip_size_kb": 64, 00:14:41.948 "state": "configuring", 00:14:41.948 "raid_level": "concat", 00:14:41.948 "superblock": false, 00:14:41.948 "num_base_bdevs": 3, 00:14:41.948 "num_base_bdevs_discovered": 2, 00:14:41.948 "num_base_bdevs_operational": 3, 00:14:41.948 "base_bdevs_list": [ 00:14:41.948 { 00:14:41.948 "name": "BaseBdev1", 00:14:41.948 "uuid": "545c9df6-dbfd-4695-b0d8-52f62017a529", 00:14:41.948 "is_configured": true, 00:14:41.948 "data_offset": 0, 00:14:41.948 "data_size": 65536 00:14:41.948 }, 00:14:41.948 { 00:14:41.948 "name": "BaseBdev2", 00:14:41.948 "uuid": "64d0ca9b-705c-458c-ad34-f6e61a1b351c", 00:14:41.948 "is_configured": true, 00:14:41.948 "data_offset": 0, 00:14:41.948 "data_size": 65536 00:14:41.948 }, 00:14:41.948 { 00:14:41.948 "name": "BaseBdev3", 00:14:41.948 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:41.948 "is_configured": false, 00:14:41.948 "data_offset": 0, 00:14:41.948 "data_size": 0 00:14:41.948 } 00:14:41.948 ] 00:14:41.948 }' 00:14:41.948 13:23:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:41.948 13:23:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:42.520 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:42.520 [2024-07-25 13:23:23.255671] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:42.520 [2024-07-25 13:23:23.255696] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x169fea0 00:14:42.520 [2024-07-25 13:23:23.255701] 
bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:14:42.520 [2024-07-25 13:23:23.255876] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x169fb70 00:14:42.520 [2024-07-25 13:23:23.255967] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x169fea0 00:14:42.520 [2024-07-25 13:23:23.255973] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x169fea0 00:14:42.520 [2024-07-25 13:23:23.256092] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:42.520 BaseBdev3 00:14:42.520 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:14:42.520 13:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:14:42.520 13:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:42.520 13:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:42.520 13:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:42.520 13:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:42.520 13:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:42.780 13:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:43.040 [ 00:14:43.040 { 00:14:43.040 "name": "BaseBdev3", 00:14:43.040 "aliases": [ 00:14:43.040 "496ee2c3-95fe-4b4f-9973-7ee9a9fc78ef" 00:14:43.040 ], 00:14:43.040 "product_name": "Malloc disk", 00:14:43.040 "block_size": 512, 00:14:43.040 "num_blocks": 65536, 00:14:43.040 
"uuid": "496ee2c3-95fe-4b4f-9973-7ee9a9fc78ef", 00:14:43.040 "assigned_rate_limits": { 00:14:43.040 "rw_ios_per_sec": 0, 00:14:43.040 "rw_mbytes_per_sec": 0, 00:14:43.040 "r_mbytes_per_sec": 0, 00:14:43.040 "w_mbytes_per_sec": 0 00:14:43.040 }, 00:14:43.040 "claimed": true, 00:14:43.040 "claim_type": "exclusive_write", 00:14:43.040 "zoned": false, 00:14:43.040 "supported_io_types": { 00:14:43.040 "read": true, 00:14:43.040 "write": true, 00:14:43.040 "unmap": true, 00:14:43.040 "flush": true, 00:14:43.040 "reset": true, 00:14:43.040 "nvme_admin": false, 00:14:43.040 "nvme_io": false, 00:14:43.040 "nvme_io_md": false, 00:14:43.040 "write_zeroes": true, 00:14:43.040 "zcopy": true, 00:14:43.040 "get_zone_info": false, 00:14:43.040 "zone_management": false, 00:14:43.040 "zone_append": false, 00:14:43.040 "compare": false, 00:14:43.040 "compare_and_write": false, 00:14:43.040 "abort": true, 00:14:43.040 "seek_hole": false, 00:14:43.040 "seek_data": false, 00:14:43.040 "copy": true, 00:14:43.040 "nvme_iov_md": false 00:14:43.040 }, 00:14:43.040 "memory_domains": [ 00:14:43.040 { 00:14:43.040 "dma_device_id": "system", 00:14:43.040 "dma_device_type": 1 00:14:43.040 }, 00:14:43.040 { 00:14:43.040 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.040 "dma_device_type": 2 00:14:43.040 } 00:14:43.040 ], 00:14:43.040 "driver_specific": {} 00:14:43.040 } 00:14:43.040 ] 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:43.040 13:23:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:43.040 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:43.300 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:43.300 "name": "Existed_Raid", 00:14:43.300 "uuid": "d534faf3-4382-46ca-b369-d9bdb557dde8", 00:14:43.300 "strip_size_kb": 64, 00:14:43.300 "state": "online", 00:14:43.300 "raid_level": "concat", 00:14:43.300 "superblock": false, 00:14:43.300 "num_base_bdevs": 3, 00:14:43.300 "num_base_bdevs_discovered": 3, 00:14:43.300 "num_base_bdevs_operational": 3, 00:14:43.300 "base_bdevs_list": [ 00:14:43.300 { 00:14:43.300 "name": "BaseBdev1", 00:14:43.300 "uuid": "545c9df6-dbfd-4695-b0d8-52f62017a529", 00:14:43.300 "is_configured": true, 00:14:43.300 "data_offset": 0, 00:14:43.300 "data_size": 65536 00:14:43.300 }, 00:14:43.300 { 00:14:43.300 "name": "BaseBdev2", 00:14:43.300 "uuid": 
"64d0ca9b-705c-458c-ad34-f6e61a1b351c", 00:14:43.300 "is_configured": true, 00:14:43.300 "data_offset": 0, 00:14:43.300 "data_size": 65536 00:14:43.300 }, 00:14:43.300 { 00:14:43.300 "name": "BaseBdev3", 00:14:43.300 "uuid": "496ee2c3-95fe-4b4f-9973-7ee9a9fc78ef", 00:14:43.300 "is_configured": true, 00:14:43.300 "data_offset": 0, 00:14:43.300 "data_size": 65536 00:14:43.300 } 00:14:43.300 ] 00:14:43.300 }' 00:14:43.300 13:23:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:43.300 13:23:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:43.868 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:43.868 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:43.868 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:43.868 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:43.868 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:43.869 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:43.869 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:43.869 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:43.869 [2024-07-25 13:23:24.599305] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:43.869 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:43.869 "name": "Existed_Raid", 00:14:43.869 "aliases": [ 00:14:43.869 "d534faf3-4382-46ca-b369-d9bdb557dde8" 00:14:43.869 ], 00:14:43.869 "product_name": "Raid Volume", 
00:14:43.869 "block_size": 512, 00:14:43.869 "num_blocks": 196608, 00:14:43.869 "uuid": "d534faf3-4382-46ca-b369-d9bdb557dde8", 00:14:43.869 "assigned_rate_limits": { 00:14:43.869 "rw_ios_per_sec": 0, 00:14:43.869 "rw_mbytes_per_sec": 0, 00:14:43.869 "r_mbytes_per_sec": 0, 00:14:43.869 "w_mbytes_per_sec": 0 00:14:43.869 }, 00:14:43.869 "claimed": false, 00:14:43.869 "zoned": false, 00:14:43.869 "supported_io_types": { 00:14:43.869 "read": true, 00:14:43.869 "write": true, 00:14:43.869 "unmap": true, 00:14:43.869 "flush": true, 00:14:43.869 "reset": true, 00:14:43.869 "nvme_admin": false, 00:14:43.869 "nvme_io": false, 00:14:43.869 "nvme_io_md": false, 00:14:43.869 "write_zeroes": true, 00:14:43.869 "zcopy": false, 00:14:43.869 "get_zone_info": false, 00:14:43.869 "zone_management": false, 00:14:43.869 "zone_append": false, 00:14:43.869 "compare": false, 00:14:43.869 "compare_and_write": false, 00:14:43.869 "abort": false, 00:14:43.869 "seek_hole": false, 00:14:43.869 "seek_data": false, 00:14:43.869 "copy": false, 00:14:43.869 "nvme_iov_md": false 00:14:43.869 }, 00:14:43.869 "memory_domains": [ 00:14:43.869 { 00:14:43.869 "dma_device_id": "system", 00:14:43.869 "dma_device_type": 1 00:14:43.869 }, 00:14:43.869 { 00:14:43.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.869 "dma_device_type": 2 00:14:43.869 }, 00:14:43.869 { 00:14:43.869 "dma_device_id": "system", 00:14:43.869 "dma_device_type": 1 00:14:43.869 }, 00:14:43.869 { 00:14:43.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.869 "dma_device_type": 2 00:14:43.869 }, 00:14:43.869 { 00:14:43.869 "dma_device_id": "system", 00:14:43.869 "dma_device_type": 1 00:14:43.869 }, 00:14:43.869 { 00:14:43.869 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:43.869 "dma_device_type": 2 00:14:43.869 } 00:14:43.869 ], 00:14:43.869 "driver_specific": { 00:14:43.869 "raid": { 00:14:43.869 "uuid": "d534faf3-4382-46ca-b369-d9bdb557dde8", 00:14:43.869 "strip_size_kb": 64, 00:14:43.869 "state": "online", 00:14:43.869 
"raid_level": "concat", 00:14:43.869 "superblock": false, 00:14:43.869 "num_base_bdevs": 3, 00:14:43.869 "num_base_bdevs_discovered": 3, 00:14:43.869 "num_base_bdevs_operational": 3, 00:14:43.869 "base_bdevs_list": [ 00:14:43.869 { 00:14:43.869 "name": "BaseBdev1", 00:14:43.869 "uuid": "545c9df6-dbfd-4695-b0d8-52f62017a529", 00:14:43.869 "is_configured": true, 00:14:43.869 "data_offset": 0, 00:14:43.869 "data_size": 65536 00:14:43.869 }, 00:14:43.869 { 00:14:43.869 "name": "BaseBdev2", 00:14:43.869 "uuid": "64d0ca9b-705c-458c-ad34-f6e61a1b351c", 00:14:43.869 "is_configured": true, 00:14:43.869 "data_offset": 0, 00:14:43.869 "data_size": 65536 00:14:43.869 }, 00:14:43.869 { 00:14:43.869 "name": "BaseBdev3", 00:14:43.869 "uuid": "496ee2c3-95fe-4b4f-9973-7ee9a9fc78ef", 00:14:43.869 "is_configured": true, 00:14:43.869 "data_offset": 0, 00:14:43.869 "data_size": 65536 00:14:43.869 } 00:14:43.869 ] 00:14:43.869 } 00:14:43.869 } 00:14:43.869 }' 00:14:43.869 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:44.129 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:44.129 BaseBdev2 00:14:44.129 BaseBdev3' 00:14:44.129 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.129 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:44.129 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:44.129 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:44.129 "name": "BaseBdev1", 00:14:44.129 "aliases": [ 00:14:44.129 "545c9df6-dbfd-4695-b0d8-52f62017a529" 00:14:44.129 ], 00:14:44.129 "product_name": "Malloc disk", 00:14:44.129 
"block_size": 512, 00:14:44.129 "num_blocks": 65536, 00:14:44.129 "uuid": "545c9df6-dbfd-4695-b0d8-52f62017a529", 00:14:44.129 "assigned_rate_limits": { 00:14:44.129 "rw_ios_per_sec": 0, 00:14:44.129 "rw_mbytes_per_sec": 0, 00:14:44.129 "r_mbytes_per_sec": 0, 00:14:44.129 "w_mbytes_per_sec": 0 00:14:44.129 }, 00:14:44.129 "claimed": true, 00:14:44.129 "claim_type": "exclusive_write", 00:14:44.129 "zoned": false, 00:14:44.129 "supported_io_types": { 00:14:44.129 "read": true, 00:14:44.129 "write": true, 00:14:44.129 "unmap": true, 00:14:44.129 "flush": true, 00:14:44.129 "reset": true, 00:14:44.129 "nvme_admin": false, 00:14:44.129 "nvme_io": false, 00:14:44.129 "nvme_io_md": false, 00:14:44.129 "write_zeroes": true, 00:14:44.129 "zcopy": true, 00:14:44.129 "get_zone_info": false, 00:14:44.129 "zone_management": false, 00:14:44.129 "zone_append": false, 00:14:44.129 "compare": false, 00:14:44.129 "compare_and_write": false, 00:14:44.129 "abort": true, 00:14:44.129 "seek_hole": false, 00:14:44.129 "seek_data": false, 00:14:44.129 "copy": true, 00:14:44.129 "nvme_iov_md": false 00:14:44.129 }, 00:14:44.129 "memory_domains": [ 00:14:44.129 { 00:14:44.129 "dma_device_id": "system", 00:14:44.129 "dma_device_type": 1 00:14:44.129 }, 00:14:44.129 { 00:14:44.129 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:44.129 "dma_device_type": 2 00:14:44.129 } 00:14:44.129 ], 00:14:44.129 "driver_specific": {} 00:14:44.129 }' 00:14:44.129 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.129 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:44.389 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:44.389 13:23:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.389 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:44.389 13:23:25 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:44.389 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.650 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:44.650 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:44.650 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.650 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:44.650 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:44.650 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:44.650 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:44.650 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:45.219 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:45.219 "name": "BaseBdev2", 00:14:45.219 "aliases": [ 00:14:45.219 "64d0ca9b-705c-458c-ad34-f6e61a1b351c" 00:14:45.219 ], 00:14:45.219 "product_name": "Malloc disk", 00:14:45.219 "block_size": 512, 00:14:45.219 "num_blocks": 65536, 00:14:45.219 "uuid": "64d0ca9b-705c-458c-ad34-f6e61a1b351c", 00:14:45.219 "assigned_rate_limits": { 00:14:45.219 "rw_ios_per_sec": 0, 00:14:45.219 "rw_mbytes_per_sec": 0, 00:14:45.219 "r_mbytes_per_sec": 0, 00:14:45.219 "w_mbytes_per_sec": 0 00:14:45.219 }, 00:14:45.219 "claimed": true, 00:14:45.219 "claim_type": "exclusive_write", 00:14:45.219 "zoned": false, 00:14:45.219 "supported_io_types": { 00:14:45.219 "read": true, 00:14:45.219 "write": true, 00:14:45.219 "unmap": true, 00:14:45.219 "flush": true, 00:14:45.219 "reset": true, 00:14:45.219 "nvme_admin": 
false, 00:14:45.219 "nvme_io": false, 00:14:45.219 "nvme_io_md": false, 00:14:45.219 "write_zeroes": true, 00:14:45.219 "zcopy": true, 00:14:45.219 "get_zone_info": false, 00:14:45.219 "zone_management": false, 00:14:45.219 "zone_append": false, 00:14:45.219 "compare": false, 00:14:45.219 "compare_and_write": false, 00:14:45.219 "abort": true, 00:14:45.219 "seek_hole": false, 00:14:45.219 "seek_data": false, 00:14:45.219 "copy": true, 00:14:45.219 "nvme_iov_md": false 00:14:45.219 }, 00:14:45.219 "memory_domains": [ 00:14:45.219 { 00:14:45.219 "dma_device_id": "system", 00:14:45.219 "dma_device_type": 1 00:14:45.219 }, 00:14:45.219 { 00:14:45.219 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:45.219 "dma_device_type": 2 00:14:45.219 } 00:14:45.219 ], 00:14:45.219 "driver_specific": {} 00:14:45.219 }' 00:14:45.219 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.219 13:23:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:45.479 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:45.479 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.479 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:45.479 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:45.479 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.738 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:45.738 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:45.738 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.738 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:45.738 13:23:26 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:45.738 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:45.738 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:45.738 13:23:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:46.308 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:46.308 "name": "BaseBdev3", 00:14:46.308 "aliases": [ 00:14:46.308 "496ee2c3-95fe-4b4f-9973-7ee9a9fc78ef" 00:14:46.308 ], 00:14:46.308 "product_name": "Malloc disk", 00:14:46.308 "block_size": 512, 00:14:46.308 "num_blocks": 65536, 00:14:46.308 "uuid": "496ee2c3-95fe-4b4f-9973-7ee9a9fc78ef", 00:14:46.308 "assigned_rate_limits": { 00:14:46.308 "rw_ios_per_sec": 0, 00:14:46.308 "rw_mbytes_per_sec": 0, 00:14:46.308 "r_mbytes_per_sec": 0, 00:14:46.308 "w_mbytes_per_sec": 0 00:14:46.308 }, 00:14:46.308 "claimed": true, 00:14:46.308 "claim_type": "exclusive_write", 00:14:46.308 "zoned": false, 00:14:46.308 "supported_io_types": { 00:14:46.308 "read": true, 00:14:46.308 "write": true, 00:14:46.308 "unmap": true, 00:14:46.308 "flush": true, 00:14:46.308 "reset": true, 00:14:46.308 "nvme_admin": false, 00:14:46.308 "nvme_io": false, 00:14:46.308 "nvme_io_md": false, 00:14:46.308 "write_zeroes": true, 00:14:46.308 "zcopy": true, 00:14:46.308 "get_zone_info": false, 00:14:46.308 "zone_management": false, 00:14:46.308 "zone_append": false, 00:14:46.308 "compare": false, 00:14:46.308 "compare_and_write": false, 00:14:46.308 "abort": true, 00:14:46.308 "seek_hole": false, 00:14:46.308 "seek_data": false, 00:14:46.308 "copy": true, 00:14:46.308 "nvme_iov_md": false 00:14:46.308 }, 00:14:46.308 "memory_domains": [ 00:14:46.308 { 00:14:46.308 "dma_device_id": "system", 00:14:46.308 "dma_device_type": 1 00:14:46.308 
}, 00:14:46.308 { 00:14:46.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:46.308 "dma_device_type": 2 00:14:46.308 } 00:14:46.308 ], 00:14:46.308 "driver_specific": {} 00:14:46.308 }' 00:14:46.308 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:46.308 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:46.308 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:46.308 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:46.568 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:46.568 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:46.568 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:46.568 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:46.568 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:46.568 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:46.568 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:46.827 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:46.827 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:47.397 [2024-07-25 13:23:27.891461] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:47.397 [2024-07-25 13:23:27.891484] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:47.397 [2024-07-25 13:23:27.891515] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:47.397 
13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:47.397 13:23:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:14:47.397 13:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:47.397 "name": "Existed_Raid", 00:14:47.397 "uuid": "d534faf3-4382-46ca-b369-d9bdb557dde8", 00:14:47.397 "strip_size_kb": 64, 00:14:47.397 "state": "offline", 00:14:47.397 "raid_level": "concat", 00:14:47.397 "superblock": false, 00:14:47.397 "num_base_bdevs": 3, 00:14:47.397 "num_base_bdevs_discovered": 2, 00:14:47.397 "num_base_bdevs_operational": 2, 00:14:47.397 "base_bdevs_list": [ 00:14:47.397 { 00:14:47.397 "name": null, 00:14:47.397 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:47.397 "is_configured": false, 00:14:47.397 "data_offset": 0, 00:14:47.397 "data_size": 65536 00:14:47.397 }, 00:14:47.397 { 00:14:47.397 "name": "BaseBdev2", 00:14:47.397 "uuid": "64d0ca9b-705c-458c-ad34-f6e61a1b351c", 00:14:47.397 "is_configured": true, 00:14:47.397 "data_offset": 0, 00:14:47.397 "data_size": 65536 00:14:47.397 }, 00:14:47.397 { 00:14:47.397 "name": "BaseBdev3", 00:14:47.397 "uuid": "496ee2c3-95fe-4b4f-9973-7ee9a9fc78ef", 00:14:47.397 "is_configured": true, 00:14:47.397 "data_offset": 0, 00:14:47.397 "data_size": 65536 00:14:47.397 } 00:14:47.397 ] 00:14:47.397 }' 00:14:47.397 13:23:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:47.397 13:23:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.337 13:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:48.337 13:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:48.337 13:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.337 13:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:48.597 13:23:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:48.597 13:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:48.597 13:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:48.857 [2024-07-25 13:23:29.467459] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:48.857 13:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:48.857 13:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:48.857 13:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:48.857 13:23:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:49.428 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:49.428 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:49.428 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:49.999 [2024-07-25 13:23:30.548051] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:49.999 [2024-07-25 13:23:30.548084] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x169fea0 name Existed_Raid, state offline 00:14:49.999 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:49.999 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:49.999 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.999 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:49.999 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:49.999 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:49.999 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:49.999 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:49.999 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:49.999 13:23:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:50.568 BaseBdev2 00:14:50.569 13:23:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:50.569 13:23:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:14:50.569 13:23:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:50.569 13:23:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:50.569 13:23:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:50.569 13:23:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:50.569 13:23:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:51.137 13:23:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:51.707 [ 00:14:51.707 { 00:14:51.707 "name": "BaseBdev2", 00:14:51.707 "aliases": [ 00:14:51.707 "cc1b58d8-3814-4725-a360-0df52150ec2b" 00:14:51.707 ], 00:14:51.707 "product_name": "Malloc disk", 00:14:51.707 "block_size": 512, 00:14:51.707 "num_blocks": 65536, 00:14:51.707 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:14:51.707 "assigned_rate_limits": { 00:14:51.707 "rw_ios_per_sec": 0, 00:14:51.707 "rw_mbytes_per_sec": 0, 00:14:51.707 "r_mbytes_per_sec": 0, 00:14:51.707 "w_mbytes_per_sec": 0 00:14:51.707 }, 00:14:51.707 "claimed": false, 00:14:51.707 "zoned": false, 00:14:51.707 "supported_io_types": { 00:14:51.707 "read": true, 00:14:51.707 "write": true, 00:14:51.707 "unmap": true, 00:14:51.707 "flush": true, 00:14:51.707 "reset": true, 00:14:51.707 "nvme_admin": false, 00:14:51.707 "nvme_io": false, 00:14:51.707 "nvme_io_md": false, 00:14:51.707 "write_zeroes": true, 00:14:51.707 "zcopy": true, 00:14:51.707 "get_zone_info": false, 00:14:51.707 "zone_management": false, 00:14:51.707 "zone_append": false, 00:14:51.707 "compare": false, 00:14:51.707 "compare_and_write": false, 00:14:51.707 "abort": true, 00:14:51.707 "seek_hole": false, 00:14:51.707 "seek_data": false, 00:14:51.707 "copy": true, 00:14:51.707 "nvme_iov_md": false 00:14:51.707 }, 00:14:51.707 "memory_domains": [ 00:14:51.707 { 00:14:51.707 "dma_device_id": "system", 00:14:51.707 "dma_device_type": 1 00:14:51.707 }, 00:14:51.707 { 00:14:51.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.707 "dma_device_type": 2 00:14:51.707 } 00:14:51.707 ], 00:14:51.707 "driver_specific": {} 00:14:51.707 } 00:14:51.707 ] 00:14:51.707 13:23:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:51.707 13:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:51.707 13:23:32 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:51.707 13:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:52.277 BaseBdev3 00:14:52.277 13:23:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:52.277 13:23:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:14:52.277 13:23:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:52.277 13:23:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:52.277 13:23:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:52.277 13:23:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:52.277 13:23:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:52.537 13:23:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:52.537 [ 00:14:52.537 { 00:14:52.537 "name": "BaseBdev3", 00:14:52.537 "aliases": [ 00:14:52.537 "5637e11d-9905-4e2f-b5ed-a0ec06bb9127" 00:14:52.537 ], 00:14:52.538 "product_name": "Malloc disk", 00:14:52.538 "block_size": 512, 00:14:52.538 "num_blocks": 65536, 00:14:52.538 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:14:52.538 "assigned_rate_limits": { 00:14:52.538 "rw_ios_per_sec": 0, 00:14:52.538 "rw_mbytes_per_sec": 0, 00:14:52.538 "r_mbytes_per_sec": 0, 00:14:52.538 "w_mbytes_per_sec": 0 00:14:52.538 }, 00:14:52.538 "claimed": false, 00:14:52.538 "zoned": false, 00:14:52.538 
"supported_io_types": { 00:14:52.538 "read": true, 00:14:52.538 "write": true, 00:14:52.538 "unmap": true, 00:14:52.538 "flush": true, 00:14:52.538 "reset": true, 00:14:52.538 "nvme_admin": false, 00:14:52.538 "nvme_io": false, 00:14:52.538 "nvme_io_md": false, 00:14:52.538 "write_zeroes": true, 00:14:52.538 "zcopy": true, 00:14:52.538 "get_zone_info": false, 00:14:52.538 "zone_management": false, 00:14:52.538 "zone_append": false, 00:14:52.538 "compare": false, 00:14:52.538 "compare_and_write": false, 00:14:52.538 "abort": true, 00:14:52.538 "seek_hole": false, 00:14:52.538 "seek_data": false, 00:14:52.538 "copy": true, 00:14:52.538 "nvme_iov_md": false 00:14:52.538 }, 00:14:52.538 "memory_domains": [ 00:14:52.538 { 00:14:52.538 "dma_device_id": "system", 00:14:52.538 "dma_device_type": 1 00:14:52.538 }, 00:14:52.538 { 00:14:52.538 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:52.538 "dma_device_type": 2 00:14:52.538 } 00:14:52.538 ], 00:14:52.538 "driver_specific": {} 00:14:52.538 } 00:14:52.538 ] 00:14:52.797 13:23:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:52.797 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:52.797 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:52.797 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:53.056 [2024-07-25 13:23:33.599486] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:53.056 [2024-07-25 13:23:33.599515] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:53.057 [2024-07-25 13:23:33.599528] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:53.057 
[2024-07-25 13:23:33.600565] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.057 13:23:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.627 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.627 "name": "Existed_Raid", 00:14:53.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.627 "strip_size_kb": 64, 00:14:53.627 "state": "configuring", 00:14:53.627 "raid_level": "concat", 00:14:53.627 "superblock": false, 00:14:53.627 "num_base_bdevs": 3, 00:14:53.627 
"num_base_bdevs_discovered": 2, 00:14:53.627 "num_base_bdevs_operational": 3, 00:14:53.627 "base_bdevs_list": [ 00:14:53.627 { 00:14:53.627 "name": "BaseBdev1", 00:14:53.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.627 "is_configured": false, 00:14:53.627 "data_offset": 0, 00:14:53.627 "data_size": 0 00:14:53.627 }, 00:14:53.627 { 00:14:53.627 "name": "BaseBdev2", 00:14:53.627 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:14:53.627 "is_configured": true, 00:14:53.627 "data_offset": 0, 00:14:53.627 "data_size": 65536 00:14:53.627 }, 00:14:53.627 { 00:14:53.627 "name": "BaseBdev3", 00:14:53.627 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:14:53.627 "is_configured": true, 00:14:53.627 "data_offset": 0, 00:14:53.627 "data_size": 65536 00:14:53.627 } 00:14:53.627 ] 00:14:53.627 }' 00:14:53.627 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.627 13:23:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:54.197 [2024-07-25 13:23:34.922821] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:54.197 13:23:34 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.197 13:23:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:54.457 13:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.457 "name": "Existed_Raid", 00:14:54.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.457 "strip_size_kb": 64, 00:14:54.457 "state": "configuring", 00:14:54.457 "raid_level": "concat", 00:14:54.457 "superblock": false, 00:14:54.457 "num_base_bdevs": 3, 00:14:54.457 "num_base_bdevs_discovered": 1, 00:14:54.457 "num_base_bdevs_operational": 3, 00:14:54.457 "base_bdevs_list": [ 00:14:54.457 { 00:14:54.457 "name": "BaseBdev1", 00:14:54.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.457 "is_configured": false, 00:14:54.457 "data_offset": 0, 00:14:54.457 "data_size": 0 00:14:54.457 }, 00:14:54.457 { 00:14:54.457 "name": null, 00:14:54.457 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:14:54.457 "is_configured": false, 00:14:54.457 "data_offset": 0, 00:14:54.457 "data_size": 65536 00:14:54.457 }, 00:14:54.457 { 00:14:54.457 "name": "BaseBdev3", 00:14:54.457 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:14:54.457 "is_configured": true, 00:14:54.457 "data_offset": 0, 
00:14:54.457 "data_size": 65536 00:14:54.457 } 00:14:54.457 ] 00:14:54.457 }' 00:14:54.457 13:23:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.457 13:23:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.397 13:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:55.397 13:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:55.657 13:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:55.657 13:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:56.227 [2024-07-25 13:23:36.756431] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:56.227 BaseBdev1 00:14:56.227 13:23:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:56.227 13:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:14:56.227 13:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:14:56.227 13:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:14:56.227 13:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:14:56.227 13:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:14:56.227 13:23:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:56.797 13:23:37 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:57.057 [ 00:14:57.057 { 00:14:57.057 "name": "BaseBdev1", 00:14:57.057 "aliases": [ 00:14:57.057 "31bcdfff-c226-47ed-909d-134ce5b6e4c5" 00:14:57.057 ], 00:14:57.057 "product_name": "Malloc disk", 00:14:57.057 "block_size": 512, 00:14:57.057 "num_blocks": 65536, 00:14:57.057 "uuid": "31bcdfff-c226-47ed-909d-134ce5b6e4c5", 00:14:57.057 "assigned_rate_limits": { 00:14:57.057 "rw_ios_per_sec": 0, 00:14:57.057 "rw_mbytes_per_sec": 0, 00:14:57.057 "r_mbytes_per_sec": 0, 00:14:57.057 "w_mbytes_per_sec": 0 00:14:57.057 }, 00:14:57.057 "claimed": true, 00:14:57.057 "claim_type": "exclusive_write", 00:14:57.057 "zoned": false, 00:14:57.057 "supported_io_types": { 00:14:57.057 "read": true, 00:14:57.057 "write": true, 00:14:57.057 "unmap": true, 00:14:57.057 "flush": true, 00:14:57.057 "reset": true, 00:14:57.057 "nvme_admin": false, 00:14:57.057 "nvme_io": false, 00:14:57.057 "nvme_io_md": false, 00:14:57.057 "write_zeroes": true, 00:14:57.057 "zcopy": true, 00:14:57.057 "get_zone_info": false, 00:14:57.057 "zone_management": false, 00:14:57.057 "zone_append": false, 00:14:57.057 "compare": false, 00:14:57.057 "compare_and_write": false, 00:14:57.057 "abort": true, 00:14:57.057 "seek_hole": false, 00:14:57.057 "seek_data": false, 00:14:57.057 "copy": true, 00:14:57.057 "nvme_iov_md": false 00:14:57.057 }, 00:14:57.057 "memory_domains": [ 00:14:57.057 { 00:14:57.057 "dma_device_id": "system", 00:14:57.057 "dma_device_type": 1 00:14:57.057 }, 00:14:57.057 { 00:14:57.057 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.057 "dma_device_type": 2 00:14:57.057 } 00:14:57.057 ], 00:14:57.057 "driver_specific": {} 00:14:57.058 } 00:14:57.058 ] 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:14:57.326 13:23:37 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.326 13:23:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:57.326 13:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:57.326 "name": "Existed_Raid", 00:14:57.326 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:57.326 "strip_size_kb": 64, 00:14:57.326 "state": "configuring", 00:14:57.326 "raid_level": "concat", 00:14:57.326 "superblock": false, 00:14:57.326 "num_base_bdevs": 3, 00:14:57.326 "num_base_bdevs_discovered": 2, 00:14:57.326 "num_base_bdevs_operational": 3, 00:14:57.326 "base_bdevs_list": [ 00:14:57.326 { 
00:14:57.326 "name": "BaseBdev1", 00:14:57.326 "uuid": "31bcdfff-c226-47ed-909d-134ce5b6e4c5", 00:14:57.326 "is_configured": true, 00:14:57.326 "data_offset": 0, 00:14:57.326 "data_size": 65536 00:14:57.326 }, 00:14:57.326 { 00:14:57.326 "name": null, 00:14:57.326 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:14:57.326 "is_configured": false, 00:14:57.326 "data_offset": 0, 00:14:57.326 "data_size": 65536 00:14:57.326 }, 00:14:57.326 { 00:14:57.326 "name": "BaseBdev3", 00:14:57.326 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:14:57.326 "is_configured": true, 00:14:57.326 "data_offset": 0, 00:14:57.326 "data_size": 65536 00:14:57.326 } 00:14:57.326 ] 00:14:57.326 }' 00:14:57.326 13:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:57.326 13:23:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:57.973 13:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:57.973 13:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:58.234 13:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:58.234 13:23:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:58.804 [2024-07-25 13:23:39.326963] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 
-- # local expected_state=configuring 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:58.804 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.374 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.374 "name": "Existed_Raid", 00:14:59.374 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.374 "strip_size_kb": 64, 00:14:59.374 "state": "configuring", 00:14:59.374 "raid_level": "concat", 00:14:59.374 "superblock": false, 00:14:59.374 "num_base_bdevs": 3, 00:14:59.374 "num_base_bdevs_discovered": 1, 00:14:59.374 "num_base_bdevs_operational": 3, 00:14:59.374 "base_bdevs_list": [ 00:14:59.374 { 00:14:59.374 "name": "BaseBdev1", 00:14:59.374 "uuid": "31bcdfff-c226-47ed-909d-134ce5b6e4c5", 00:14:59.374 "is_configured": true, 00:14:59.374 "data_offset": 0, 00:14:59.374 "data_size": 65536 00:14:59.374 }, 00:14:59.374 { 00:14:59.374 "name": null, 00:14:59.374 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:14:59.374 
"is_configured": false, 00:14:59.374 "data_offset": 0, 00:14:59.374 "data_size": 65536 00:14:59.374 }, 00:14:59.374 { 00:14:59.374 "name": null, 00:14:59.374 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:14:59.374 "is_configured": false, 00:14:59.374 "data_offset": 0, 00:14:59.374 "data_size": 65536 00:14:59.374 } 00:14:59.374 ] 00:14:59.374 }' 00:14:59.374 13:23:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.374 13:23:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:59.943 13:23:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.943 13:23:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:59.943 13:23:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:59.943 13:23:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:00.513 [2024-07-25 13:23:41.171785] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:00.513 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:00.513 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:00.513 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:00.513 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:00.513 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:00.513 13:23:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:00.513 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:00.513 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:00.513 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:00.513 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:00.513 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.514 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:00.774 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:00.774 "name": "Existed_Raid", 00:15:00.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:00.774 "strip_size_kb": 64, 00:15:00.774 "state": "configuring", 00:15:00.774 "raid_level": "concat", 00:15:00.774 "superblock": false, 00:15:00.774 "num_base_bdevs": 3, 00:15:00.774 "num_base_bdevs_discovered": 2, 00:15:00.774 "num_base_bdevs_operational": 3, 00:15:00.774 "base_bdevs_list": [ 00:15:00.774 { 00:15:00.774 "name": "BaseBdev1", 00:15:00.774 "uuid": "31bcdfff-c226-47ed-909d-134ce5b6e4c5", 00:15:00.774 "is_configured": true, 00:15:00.774 "data_offset": 0, 00:15:00.774 "data_size": 65536 00:15:00.774 }, 00:15:00.774 { 00:15:00.774 "name": null, 00:15:00.774 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:15:00.774 "is_configured": false, 00:15:00.774 "data_offset": 0, 00:15:00.774 "data_size": 65536 00:15:00.774 }, 00:15:00.774 { 00:15:00.774 "name": "BaseBdev3", 00:15:00.774 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:15:00.774 "is_configured": true, 00:15:00.774 "data_offset": 0, 
00:15:00.774 "data_size": 65536 00:15:00.774 } 00:15:00.774 ] 00:15:00.774 }' 00:15:00.774 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:00.774 13:23:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:01.344 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.344 13:23:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:01.603 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:01.603 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:02.174 [2024-07-25 13:23:42.667589] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:02.175 
13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:02.175 "name": "Existed_Raid", 00:15:02.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:02.175 "strip_size_kb": 64, 00:15:02.175 "state": "configuring", 00:15:02.175 "raid_level": "concat", 00:15:02.175 "superblock": false, 00:15:02.175 "num_base_bdevs": 3, 00:15:02.175 "num_base_bdevs_discovered": 1, 00:15:02.175 "num_base_bdevs_operational": 3, 00:15:02.175 "base_bdevs_list": [ 00:15:02.175 { 00:15:02.175 "name": null, 00:15:02.175 "uuid": "31bcdfff-c226-47ed-909d-134ce5b6e4c5", 00:15:02.175 "is_configured": false, 00:15:02.175 "data_offset": 0, 00:15:02.175 "data_size": 65536 00:15:02.175 }, 00:15:02.175 { 00:15:02.175 "name": null, 00:15:02.175 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:15:02.175 "is_configured": false, 00:15:02.175 "data_offset": 0, 00:15:02.175 "data_size": 65536 00:15:02.175 }, 00:15:02.175 { 00:15:02.175 "name": "BaseBdev3", 00:15:02.175 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:15:02.175 "is_configured": true, 00:15:02.175 "data_offset": 0, 00:15:02.175 "data_size": 65536 00:15:02.175 } 00:15:02.175 ] 00:15:02.175 }' 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:02.175 13:23:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:02.744 13:23:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:02.744 13:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:03.003 13:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:03.003 13:23:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:03.571 [2024-07-25 13:23:44.117015] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.572 "name": "Existed_Raid", 00:15:03.572 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.572 "strip_size_kb": 64, 00:15:03.572 "state": "configuring", 00:15:03.572 "raid_level": "concat", 00:15:03.572 "superblock": false, 00:15:03.572 "num_base_bdevs": 3, 00:15:03.572 "num_base_bdevs_discovered": 2, 00:15:03.572 "num_base_bdevs_operational": 3, 00:15:03.572 "base_bdevs_list": [ 00:15:03.572 { 00:15:03.572 "name": null, 00:15:03.572 "uuid": "31bcdfff-c226-47ed-909d-134ce5b6e4c5", 00:15:03.572 "is_configured": false, 00:15:03.572 "data_offset": 0, 00:15:03.572 "data_size": 65536 00:15:03.572 }, 00:15:03.572 { 00:15:03.572 "name": "BaseBdev2", 00:15:03.572 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:15:03.572 "is_configured": true, 00:15:03.572 "data_offset": 0, 00:15:03.572 "data_size": 65536 00:15:03.572 }, 00:15:03.572 { 00:15:03.572 "name": "BaseBdev3", 00:15:03.572 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:15:03.572 "is_configured": true, 00:15:03.572 "data_offset": 0, 00:15:03.572 "data_size": 65536 00:15:03.572 } 00:15:03.572 ] 00:15:03.572 }' 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.572 13:23:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.140 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:04.140 13:23:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.400 
13:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:04.400 13:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.400 13:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:04.660 13:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 31bcdfff-c226-47ed-909d-134ce5b6e4c5 00:15:04.660 [2024-07-25 13:23:45.425302] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:04.660 [2024-07-25 13:23:45.425326] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x169e120 00:15:04.660 [2024-07-25 13:23:45.425330] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:15:04.660 [2024-07-25 13:23:45.425475] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16aa290 00:15:04.660 [2024-07-25 13:23:45.425568] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x169e120 00:15:04.660 [2024-07-25 13:23:45.425579] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x169e120 00:15:04.660 [2024-07-25 13:23:45.425700] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:04.660 NewBaseBdev 00:15:04.660 13:23:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:04.660 13:23:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:15:04.660 13:23:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:04.660 13:23:45 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@901 -- # local i 00:15:04.660 13:23:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:04.660 13:23:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:04.660 13:23:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:05.229 13:23:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:05.798 [ 00:15:05.798 { 00:15:05.798 "name": "NewBaseBdev", 00:15:05.798 "aliases": [ 00:15:05.798 "31bcdfff-c226-47ed-909d-134ce5b6e4c5" 00:15:05.798 ], 00:15:05.798 "product_name": "Malloc disk", 00:15:05.798 "block_size": 512, 00:15:05.798 "num_blocks": 65536, 00:15:05.798 "uuid": "31bcdfff-c226-47ed-909d-134ce5b6e4c5", 00:15:05.798 "assigned_rate_limits": { 00:15:05.798 "rw_ios_per_sec": 0, 00:15:05.798 "rw_mbytes_per_sec": 0, 00:15:05.798 "r_mbytes_per_sec": 0, 00:15:05.798 "w_mbytes_per_sec": 0 00:15:05.798 }, 00:15:05.798 "claimed": true, 00:15:05.798 "claim_type": "exclusive_write", 00:15:05.798 "zoned": false, 00:15:05.798 "supported_io_types": { 00:15:05.798 "read": true, 00:15:05.798 "write": true, 00:15:05.798 "unmap": true, 00:15:05.798 "flush": true, 00:15:05.798 "reset": true, 00:15:05.798 "nvme_admin": false, 00:15:05.798 "nvme_io": false, 00:15:05.798 "nvme_io_md": false, 00:15:05.798 "write_zeroes": true, 00:15:05.798 "zcopy": true, 00:15:05.798 "get_zone_info": false, 00:15:05.798 "zone_management": false, 00:15:05.798 "zone_append": false, 00:15:05.798 "compare": false, 00:15:05.798 "compare_and_write": false, 00:15:05.798 "abort": true, 00:15:05.798 "seek_hole": false, 00:15:05.798 "seek_data": false, 00:15:05.798 "copy": true, 00:15:05.799 "nvme_iov_md": 
false 00:15:05.799 }, 00:15:05.799 "memory_domains": [ 00:15:05.799 { 00:15:05.799 "dma_device_id": "system", 00:15:05.799 "dma_device_type": 1 00:15:05.799 }, 00:15:05.799 { 00:15:05.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:05.799 "dma_device_type": 2 00:15:05.799 } 00:15:05.799 ], 00:15:05.799 "driver_specific": {} 00:15:05.799 } 00:15:05.799 ] 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.799 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.058 13:23:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.058 "name": "Existed_Raid", 00:15:06.058 "uuid": "17cf0f73-2bcb-4c15-bb4f-6a7ead407f87", 00:15:06.058 "strip_size_kb": 64, 00:15:06.058 "state": "online", 00:15:06.058 "raid_level": "concat", 00:15:06.058 "superblock": false, 00:15:06.058 "num_base_bdevs": 3, 00:15:06.058 "num_base_bdevs_discovered": 3, 00:15:06.058 "num_base_bdevs_operational": 3, 00:15:06.058 "base_bdevs_list": [ 00:15:06.058 { 00:15:06.058 "name": "NewBaseBdev", 00:15:06.058 "uuid": "31bcdfff-c226-47ed-909d-134ce5b6e4c5", 00:15:06.058 "is_configured": true, 00:15:06.058 "data_offset": 0, 00:15:06.058 "data_size": 65536 00:15:06.058 }, 00:15:06.058 { 00:15:06.058 "name": "BaseBdev2", 00:15:06.058 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:15:06.058 "is_configured": true, 00:15:06.058 "data_offset": 0, 00:15:06.058 "data_size": 65536 00:15:06.058 }, 00:15:06.058 { 00:15:06.058 "name": "BaseBdev3", 00:15:06.058 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:15:06.059 "is_configured": true, 00:15:06.059 "data_offset": 0, 00:15:06.059 "data_size": 65536 00:15:06.059 } 00:15:06.059 ] 00:15:06.059 }' 00:15:06.059 13:23:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.059 13:23:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.626 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:06.626 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:06.626 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:06.626 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:06.626 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:06.626 13:23:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:06.626 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:06.626 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:06.626 [2024-07-25 13:23:47.402562] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:06.886 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:06.886 "name": "Existed_Raid", 00:15:06.886 "aliases": [ 00:15:06.886 "17cf0f73-2bcb-4c15-bb4f-6a7ead407f87" 00:15:06.886 ], 00:15:06.886 "product_name": "Raid Volume", 00:15:06.886 "block_size": 512, 00:15:06.886 "num_blocks": 196608, 00:15:06.886 "uuid": "17cf0f73-2bcb-4c15-bb4f-6a7ead407f87", 00:15:06.886 "assigned_rate_limits": { 00:15:06.886 "rw_ios_per_sec": 0, 00:15:06.886 "rw_mbytes_per_sec": 0, 00:15:06.886 "r_mbytes_per_sec": 0, 00:15:06.886 "w_mbytes_per_sec": 0 00:15:06.886 }, 00:15:06.886 "claimed": false, 00:15:06.886 "zoned": false, 00:15:06.886 "supported_io_types": { 00:15:06.886 "read": true, 00:15:06.886 "write": true, 00:15:06.886 "unmap": true, 00:15:06.886 "flush": true, 00:15:06.886 "reset": true, 00:15:06.886 "nvme_admin": false, 00:15:06.886 "nvme_io": false, 00:15:06.886 "nvme_io_md": false, 00:15:06.886 "write_zeroes": true, 00:15:06.886 "zcopy": false, 00:15:06.886 "get_zone_info": false, 00:15:06.886 "zone_management": false, 00:15:06.886 "zone_append": false, 00:15:06.886 "compare": false, 00:15:06.886 "compare_and_write": false, 00:15:06.886 "abort": false, 00:15:06.886 "seek_hole": false, 00:15:06.886 "seek_data": false, 00:15:06.886 "copy": false, 00:15:06.886 "nvme_iov_md": false 00:15:06.886 }, 00:15:06.886 "memory_domains": [ 00:15:06.886 { 00:15:06.886 "dma_device_id": "system", 00:15:06.886 "dma_device_type": 1 00:15:06.886 }, 
00:15:06.886 { 00:15:06.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.886 "dma_device_type": 2 00:15:06.886 }, 00:15:06.886 { 00:15:06.886 "dma_device_id": "system", 00:15:06.886 "dma_device_type": 1 00:15:06.886 }, 00:15:06.886 { 00:15:06.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.886 "dma_device_type": 2 00:15:06.886 }, 00:15:06.886 { 00:15:06.886 "dma_device_id": "system", 00:15:06.886 "dma_device_type": 1 00:15:06.886 }, 00:15:06.886 { 00:15:06.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.886 "dma_device_type": 2 00:15:06.886 } 00:15:06.886 ], 00:15:06.886 "driver_specific": { 00:15:06.886 "raid": { 00:15:06.886 "uuid": "17cf0f73-2bcb-4c15-bb4f-6a7ead407f87", 00:15:06.886 "strip_size_kb": 64, 00:15:06.886 "state": "online", 00:15:06.886 "raid_level": "concat", 00:15:06.886 "superblock": false, 00:15:06.886 "num_base_bdevs": 3, 00:15:06.886 "num_base_bdevs_discovered": 3, 00:15:06.886 "num_base_bdevs_operational": 3, 00:15:06.886 "base_bdevs_list": [ 00:15:06.886 { 00:15:06.886 "name": "NewBaseBdev", 00:15:06.886 "uuid": "31bcdfff-c226-47ed-909d-134ce5b6e4c5", 00:15:06.886 "is_configured": true, 00:15:06.886 "data_offset": 0, 00:15:06.886 "data_size": 65536 00:15:06.886 }, 00:15:06.886 { 00:15:06.886 "name": "BaseBdev2", 00:15:06.886 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:15:06.886 "is_configured": true, 00:15:06.886 "data_offset": 0, 00:15:06.886 "data_size": 65536 00:15:06.886 }, 00:15:06.886 { 00:15:06.886 "name": "BaseBdev3", 00:15:06.886 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:15:06.886 "is_configured": true, 00:15:06.886 "data_offset": 0, 00:15:06.886 "data_size": 65536 00:15:06.886 } 00:15:06.886 ] 00:15:06.886 } 00:15:06.886 } 00:15:06.886 }' 00:15:06.886 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:06.886 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='NewBaseBdev 00:15:06.886 BaseBdev2 00:15:06.886 BaseBdev3' 00:15:06.887 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:06.887 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:06.887 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:06.887 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:06.887 "name": "NewBaseBdev", 00:15:06.887 "aliases": [ 00:15:06.887 "31bcdfff-c226-47ed-909d-134ce5b6e4c5" 00:15:06.887 ], 00:15:06.887 "product_name": "Malloc disk", 00:15:06.887 "block_size": 512, 00:15:06.887 "num_blocks": 65536, 00:15:06.887 "uuid": "31bcdfff-c226-47ed-909d-134ce5b6e4c5", 00:15:06.887 "assigned_rate_limits": { 00:15:06.887 "rw_ios_per_sec": 0, 00:15:06.887 "rw_mbytes_per_sec": 0, 00:15:06.887 "r_mbytes_per_sec": 0, 00:15:06.887 "w_mbytes_per_sec": 0 00:15:06.887 }, 00:15:06.887 "claimed": true, 00:15:06.887 "claim_type": "exclusive_write", 00:15:06.887 "zoned": false, 00:15:06.887 "supported_io_types": { 00:15:06.887 "read": true, 00:15:06.887 "write": true, 00:15:06.887 "unmap": true, 00:15:06.887 "flush": true, 00:15:06.887 "reset": true, 00:15:06.887 "nvme_admin": false, 00:15:06.887 "nvme_io": false, 00:15:06.887 "nvme_io_md": false, 00:15:06.887 "write_zeroes": true, 00:15:06.887 "zcopy": true, 00:15:06.887 "get_zone_info": false, 00:15:06.887 "zone_management": false, 00:15:06.887 "zone_append": false, 00:15:06.887 "compare": false, 00:15:06.887 "compare_and_write": false, 00:15:06.887 "abort": true, 00:15:06.887 "seek_hole": false, 00:15:06.887 "seek_data": false, 00:15:06.887 "copy": true, 00:15:06.887 "nvme_iov_md": false 00:15:06.887 }, 00:15:06.887 "memory_domains": [ 00:15:06.887 { 00:15:06.887 "dma_device_id": "system", 00:15:06.887 
"dma_device_type": 1 00:15:06.887 }, 00:15:06.887 { 00:15:06.887 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.887 "dma_device_type": 2 00:15:06.887 } 00:15:06.887 ], 00:15:06.887 "driver_specific": {} 00:15:06.887 }' 00:15:06.887 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:07.146 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:07.146 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:07.146 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:07.146 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:07.146 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:07.146 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:07.146 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:07.146 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:07.405 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:07.405 13:23:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:07.405 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:07.405 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:07.405 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:07.405 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:07.405 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:07.405 "name": 
"BaseBdev2", 00:15:07.405 "aliases": [ 00:15:07.405 "cc1b58d8-3814-4725-a360-0df52150ec2b" 00:15:07.405 ], 00:15:07.405 "product_name": "Malloc disk", 00:15:07.405 "block_size": 512, 00:15:07.405 "num_blocks": 65536, 00:15:07.405 "uuid": "cc1b58d8-3814-4725-a360-0df52150ec2b", 00:15:07.405 "assigned_rate_limits": { 00:15:07.405 "rw_ios_per_sec": 0, 00:15:07.405 "rw_mbytes_per_sec": 0, 00:15:07.405 "r_mbytes_per_sec": 0, 00:15:07.405 "w_mbytes_per_sec": 0 00:15:07.405 }, 00:15:07.405 "claimed": true, 00:15:07.405 "claim_type": "exclusive_write", 00:15:07.405 "zoned": false, 00:15:07.405 "supported_io_types": { 00:15:07.405 "read": true, 00:15:07.405 "write": true, 00:15:07.405 "unmap": true, 00:15:07.405 "flush": true, 00:15:07.405 "reset": true, 00:15:07.405 "nvme_admin": false, 00:15:07.405 "nvme_io": false, 00:15:07.405 "nvme_io_md": false, 00:15:07.405 "write_zeroes": true, 00:15:07.405 "zcopy": true, 00:15:07.405 "get_zone_info": false, 00:15:07.405 "zone_management": false, 00:15:07.405 "zone_append": false, 00:15:07.405 "compare": false, 00:15:07.405 "compare_and_write": false, 00:15:07.405 "abort": true, 00:15:07.405 "seek_hole": false, 00:15:07.405 "seek_data": false, 00:15:07.405 "copy": true, 00:15:07.405 "nvme_iov_md": false 00:15:07.405 }, 00:15:07.405 "memory_domains": [ 00:15:07.405 { 00:15:07.405 "dma_device_id": "system", 00:15:07.405 "dma_device_type": 1 00:15:07.405 }, 00:15:07.405 { 00:15:07.405 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:07.405 "dma_device_type": 2 00:15:07.405 } 00:15:07.405 ], 00:15:07.405 "driver_specific": {} 00:15:07.405 }' 00:15:07.405 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:07.664 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:07.664 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:07.664 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:15:07.664 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:07.664 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:07.665 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:07.665 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:07.665 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:07.665 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:07.923 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:07.923 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:07.923 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:07.924 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:07.924 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:07.924 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:07.924 "name": "BaseBdev3", 00:15:07.924 "aliases": [ 00:15:07.924 "5637e11d-9905-4e2f-b5ed-a0ec06bb9127" 00:15:07.924 ], 00:15:07.924 "product_name": "Malloc disk", 00:15:07.924 "block_size": 512, 00:15:07.924 "num_blocks": 65536, 00:15:07.924 "uuid": "5637e11d-9905-4e2f-b5ed-a0ec06bb9127", 00:15:07.924 "assigned_rate_limits": { 00:15:07.924 "rw_ios_per_sec": 0, 00:15:07.924 "rw_mbytes_per_sec": 0, 00:15:07.924 "r_mbytes_per_sec": 0, 00:15:07.924 "w_mbytes_per_sec": 0 00:15:07.924 }, 00:15:07.924 "claimed": true, 00:15:07.924 "claim_type": "exclusive_write", 00:15:07.924 "zoned": false, 00:15:07.924 "supported_io_types": { 
00:15:07.924 "read": true, 00:15:07.924 "write": true, 00:15:07.924 "unmap": true, 00:15:07.924 "flush": true, 00:15:07.924 "reset": true, 00:15:07.924 "nvme_admin": false, 00:15:07.924 "nvme_io": false, 00:15:07.924 "nvme_io_md": false, 00:15:07.924 "write_zeroes": true, 00:15:07.924 "zcopy": true, 00:15:07.924 "get_zone_info": false, 00:15:07.924 "zone_management": false, 00:15:07.924 "zone_append": false, 00:15:07.924 "compare": false, 00:15:07.924 "compare_and_write": false, 00:15:07.924 "abort": true, 00:15:07.924 "seek_hole": false, 00:15:07.924 "seek_data": false, 00:15:07.924 "copy": true, 00:15:07.924 "nvme_iov_md": false 00:15:07.924 }, 00:15:07.924 "memory_domains": [ 00:15:07.924 { 00:15:07.924 "dma_device_id": "system", 00:15:07.924 "dma_device_type": 1 00:15:07.924 }, 00:15:07.924 { 00:15:07.924 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:07.924 "dma_device_type": 2 00:15:07.924 } 00:15:07.924 ], 00:15:07.924 "driver_specific": {} 00:15:07.924 }' 00:15:07.924 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:08.183 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:08.183 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:08.183 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:08.183 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:08.183 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:08.183 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:08.183 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:08.442 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:08.442 13:23:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:08.442 13:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:08.442 13:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:08.442 13:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:08.701 [2024-07-25 13:23:49.247018] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:08.701 [2024-07-25 13:23:49.247033] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:08.701 [2024-07-25 13:23:49.247068] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:08.701 [2024-07-25 13:23:49.247104] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:08.701 [2024-07-25 13:23:49.247109] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x169e120 name Existed_Raid, state offline 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 909410 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 909410 ']' 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 909410 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 909410 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = 
sudo ']' 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 909410' 00:15:08.701 killing process with pid 909410 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 909410 00:15:08.701 [2024-07-25 13:23:49.314740] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 909410 00:15:08.701 [2024-07-25 13:23:49.329464] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:08.701 00:15:08.701 real 0m33.040s 00:15:08.701 user 1m2.232s 00:15:08.701 sys 0m4.363s 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:08.701 13:23:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.701 ************************************ 00:15:08.701 END TEST raid_state_function_test 00:15:08.702 ************************************ 00:15:08.702 13:23:49 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:15:08.702 13:23:49 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:08.702 13:23:49 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:08.702 13:23:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:08.962 ************************************ 00:15:08.962 START TEST raid_state_function_test_sb 00:15:08.962 ************************************ 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 3 true 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:15:08.962 13:23:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:08.962 13:23:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=915507 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 915507' 00:15:08.962 Process raid pid: 915507 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 915507 /var/tmp/spdk-raid.sock 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 915507 ']' 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:08.962 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:08.962 13:23:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:08.962 [2024-07-25 13:23:49.587940] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:15:08.962 [2024-07-25 13:23:49.587991] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:08.962 [2024-07-25 13:23:49.675981] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:08.962 [2024-07-25 13:23:49.740724] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:09.222 [2024-07-25 13:23:49.779122] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:09.222 [2024-07-25 13:23:49.779143] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:09.791 13:23:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:09.791 13:23:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:15:09.791 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:10.051 [2024-07-25 13:23:50.586151] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:10.051 [2024-07-25 13:23:50.586181] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:10.051 [2024-07-25 13:23:50.586187] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently 
unable to find bdev with name: BaseBdev2 00:15:10.051 [2024-07-25 13:23:50.586193] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:10.051 [2024-07-25 13:23:50.586197] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:10.051 [2024-07-25 13:23:50.586202] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.051 13:23:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.051 "name": "Existed_Raid", 00:15:10.051 "uuid": "8a67b061-97e9-4004-abf3-c99dda8d6474", 00:15:10.051 "strip_size_kb": 64, 00:15:10.051 "state": "configuring", 00:15:10.051 "raid_level": "concat", 00:15:10.051 "superblock": true, 00:15:10.051 "num_base_bdevs": 3, 00:15:10.051 "num_base_bdevs_discovered": 0, 00:15:10.051 "num_base_bdevs_operational": 3, 00:15:10.051 "base_bdevs_list": [ 00:15:10.051 { 00:15:10.051 "name": "BaseBdev1", 00:15:10.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.051 "is_configured": false, 00:15:10.051 "data_offset": 0, 00:15:10.051 "data_size": 0 00:15:10.051 }, 00:15:10.051 { 00:15:10.051 "name": "BaseBdev2", 00:15:10.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.051 "is_configured": false, 00:15:10.051 "data_offset": 0, 00:15:10.051 "data_size": 0 00:15:10.051 }, 00:15:10.051 { 00:15:10.051 "name": "BaseBdev3", 00:15:10.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.051 "is_configured": false, 00:15:10.051 "data_offset": 0, 00:15:10.051 "data_size": 0 00:15:10.051 } 00:15:10.051 ] 00:15:10.051 }' 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.051 13:23:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:10.618 13:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:10.878 [2024-07-25 13:23:51.524401] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:10.878 [2024-07-25 13:23:51.524423] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d466d0 name Existed_Raid, state configuring 00:15:10.878 13:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:11.137 [2024-07-25 13:23:51.720924] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:11.137 [2024-07-25 13:23:51.720943] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:11.137 [2024-07-25 13:23:51.720948] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:11.137 [2024-07-25 13:23:51.720953] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:11.137 [2024-07-25 13:23:51.720957] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:11.137 [2024-07-25 13:23:51.720963] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:11.137 13:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:11.137 [2024-07-25 13:23:51.903976] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:11.137 BaseBdev1 00:15:11.137 13:23:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:11.137 13:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:11.137 13:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:11.137 13:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:11.137 13:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:11.137 13:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:15:11.137 13:23:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:11.397 13:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:11.657 [ 00:15:11.657 { 00:15:11.657 "name": "BaseBdev1", 00:15:11.657 "aliases": [ 00:15:11.657 "8fe0b287-b529-469b-90e1-695299e9b64e" 00:15:11.657 ], 00:15:11.657 "product_name": "Malloc disk", 00:15:11.657 "block_size": 512, 00:15:11.657 "num_blocks": 65536, 00:15:11.657 "uuid": "8fe0b287-b529-469b-90e1-695299e9b64e", 00:15:11.657 "assigned_rate_limits": { 00:15:11.657 "rw_ios_per_sec": 0, 00:15:11.657 "rw_mbytes_per_sec": 0, 00:15:11.657 "r_mbytes_per_sec": 0, 00:15:11.657 "w_mbytes_per_sec": 0 00:15:11.657 }, 00:15:11.657 "claimed": true, 00:15:11.657 "claim_type": "exclusive_write", 00:15:11.657 "zoned": false, 00:15:11.657 "supported_io_types": { 00:15:11.657 "read": true, 00:15:11.657 "write": true, 00:15:11.657 "unmap": true, 00:15:11.657 "flush": true, 00:15:11.657 "reset": true, 00:15:11.657 "nvme_admin": false, 00:15:11.657 "nvme_io": false, 00:15:11.657 "nvme_io_md": false, 00:15:11.657 "write_zeroes": true, 00:15:11.657 "zcopy": true, 00:15:11.657 "get_zone_info": false, 00:15:11.657 "zone_management": false, 00:15:11.657 "zone_append": false, 00:15:11.657 "compare": false, 00:15:11.657 "compare_and_write": false, 00:15:11.657 "abort": true, 00:15:11.657 "seek_hole": false, 00:15:11.657 "seek_data": false, 00:15:11.657 "copy": true, 00:15:11.657 "nvme_iov_md": false 00:15:11.657 }, 00:15:11.657 "memory_domains": [ 00:15:11.657 { 00:15:11.657 "dma_device_id": "system", 00:15:11.657 "dma_device_type": 1 00:15:11.657 }, 00:15:11.657 { 00:15:11.657 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:11.657 
"dma_device_type": 2 00:15:11.657 } 00:15:11.657 ], 00:15:11.657 "driver_specific": {} 00:15:11.657 } 00:15:11.657 ] 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.657 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.916 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.916 "name": "Existed_Raid", 00:15:11.916 "uuid": "c11797be-874e-456e-8c53-cae4f72c6a57", 00:15:11.916 "strip_size_kb": 64, 
00:15:11.916 "state": "configuring", 00:15:11.916 "raid_level": "concat", 00:15:11.916 "superblock": true, 00:15:11.916 "num_base_bdevs": 3, 00:15:11.916 "num_base_bdevs_discovered": 1, 00:15:11.916 "num_base_bdevs_operational": 3, 00:15:11.916 "base_bdevs_list": [ 00:15:11.916 { 00:15:11.916 "name": "BaseBdev1", 00:15:11.916 "uuid": "8fe0b287-b529-469b-90e1-695299e9b64e", 00:15:11.916 "is_configured": true, 00:15:11.916 "data_offset": 2048, 00:15:11.916 "data_size": 63488 00:15:11.916 }, 00:15:11.916 { 00:15:11.916 "name": "BaseBdev2", 00:15:11.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.916 "is_configured": false, 00:15:11.916 "data_offset": 0, 00:15:11.916 "data_size": 0 00:15:11.916 }, 00:15:11.916 { 00:15:11.916 "name": "BaseBdev3", 00:15:11.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.916 "is_configured": false, 00:15:11.916 "data_offset": 0, 00:15:11.916 "data_size": 0 00:15:11.916 } 00:15:11.916 ] 00:15:11.916 }' 00:15:11.916 13:23:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.916 13:23:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:12.484 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:12.484 [2024-07-25 13:23:53.219299] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:12.484 [2024-07-25 13:23:53.219324] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d45fa0 name Existed_Raid, state configuring 00:15:12.484 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:12.744 [2024-07-25 13:23:53.415831] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:12.744 [2024-07-25 13:23:53.416944] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:12.744 [2024-07-25 13:23:53.416966] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:12.744 [2024-07-25 13:23:53.416971] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:12.744 [2024-07-25 13:23:53.416977] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 
-- # local tmp 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:12.744 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.003 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.003 "name": "Existed_Raid", 00:15:13.003 "uuid": "53cd02c2-568c-4033-b9be-26533fe4e961", 00:15:13.003 "strip_size_kb": 64, 00:15:13.003 "state": "configuring", 00:15:13.003 "raid_level": "concat", 00:15:13.003 "superblock": true, 00:15:13.003 "num_base_bdevs": 3, 00:15:13.003 "num_base_bdevs_discovered": 1, 00:15:13.003 "num_base_bdevs_operational": 3, 00:15:13.003 "base_bdevs_list": [ 00:15:13.003 { 00:15:13.003 "name": "BaseBdev1", 00:15:13.003 "uuid": "8fe0b287-b529-469b-90e1-695299e9b64e", 00:15:13.003 "is_configured": true, 00:15:13.003 "data_offset": 2048, 00:15:13.003 "data_size": 63488 00:15:13.003 }, 00:15:13.003 { 00:15:13.003 "name": "BaseBdev2", 00:15:13.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.003 "is_configured": false, 00:15:13.003 "data_offset": 0, 00:15:13.003 "data_size": 0 00:15:13.003 }, 00:15:13.003 { 00:15:13.003 "name": "BaseBdev3", 00:15:13.003 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:13.003 "is_configured": false, 00:15:13.003 "data_offset": 0, 00:15:13.003 "data_size": 0 00:15:13.003 } 00:15:13.003 ] 00:15:13.003 }' 00:15:13.003 13:23:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.003 13:23:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:13.572 13:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 
00:15:13.572 [2024-07-25 13:23:54.347185] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:13.572 BaseBdev2 00:15:13.572 13:23:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:13.572 13:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:13.572 13:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:13.572 13:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:13.572 13:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:13.572 13:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:13.572 13:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:14.140 13:23:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:14.400 [ 00:15:14.400 { 00:15:14.400 "name": "BaseBdev2", 00:15:14.400 "aliases": [ 00:15:14.400 "91dac954-a433-47d7-9302-94ba7ca22ce2" 00:15:14.400 ], 00:15:14.400 "product_name": "Malloc disk", 00:15:14.400 "block_size": 512, 00:15:14.400 "num_blocks": 65536, 00:15:14.400 "uuid": "91dac954-a433-47d7-9302-94ba7ca22ce2", 00:15:14.400 "assigned_rate_limits": { 00:15:14.400 "rw_ios_per_sec": 0, 00:15:14.400 "rw_mbytes_per_sec": 0, 00:15:14.400 "r_mbytes_per_sec": 0, 00:15:14.400 "w_mbytes_per_sec": 0 00:15:14.400 }, 00:15:14.400 "claimed": true, 00:15:14.400 "claim_type": "exclusive_write", 00:15:14.400 "zoned": false, 00:15:14.400 "supported_io_types": { 00:15:14.400 "read": true, 00:15:14.400 "write": true, 
00:15:14.400 "unmap": true, 00:15:14.400 "flush": true, 00:15:14.400 "reset": true, 00:15:14.400 "nvme_admin": false, 00:15:14.400 "nvme_io": false, 00:15:14.400 "nvme_io_md": false, 00:15:14.400 "write_zeroes": true, 00:15:14.400 "zcopy": true, 00:15:14.400 "get_zone_info": false, 00:15:14.400 "zone_management": false, 00:15:14.400 "zone_append": false, 00:15:14.400 "compare": false, 00:15:14.400 "compare_and_write": false, 00:15:14.400 "abort": true, 00:15:14.400 "seek_hole": false, 00:15:14.400 "seek_data": false, 00:15:14.400 "copy": true, 00:15:14.400 "nvme_iov_md": false 00:15:14.400 }, 00:15:14.400 "memory_domains": [ 00:15:14.400 { 00:15:14.400 "dma_device_id": "system", 00:15:14.400 "dma_device_type": 1 00:15:14.400 }, 00:15:14.400 { 00:15:14.400 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.400 "dma_device_type": 2 00:15:14.400 } 00:15:14.400 ], 00:15:14.400 "driver_specific": {} 00:15:14.400 } 00:15:14.400 ] 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:14.400 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:14.659 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:14.659 "name": "Existed_Raid", 00:15:14.659 "uuid": "53cd02c2-568c-4033-b9be-26533fe4e961", 00:15:14.659 "strip_size_kb": 64, 00:15:14.659 "state": "configuring", 00:15:14.659 "raid_level": "concat", 00:15:14.659 "superblock": true, 00:15:14.659 "num_base_bdevs": 3, 00:15:14.659 "num_base_bdevs_discovered": 2, 00:15:14.659 "num_base_bdevs_operational": 3, 00:15:14.659 "base_bdevs_list": [ 00:15:14.659 { 00:15:14.659 "name": "BaseBdev1", 00:15:14.659 "uuid": "8fe0b287-b529-469b-90e1-695299e9b64e", 00:15:14.659 "is_configured": true, 00:15:14.659 "data_offset": 2048, 00:15:14.659 "data_size": 63488 00:15:14.659 }, 00:15:14.659 { 00:15:14.659 "name": "BaseBdev2", 00:15:14.659 "uuid": "91dac954-a433-47d7-9302-94ba7ca22ce2", 00:15:14.659 "is_configured": true, 00:15:14.659 "data_offset": 2048, 00:15:14.659 "data_size": 63488 00:15:14.659 }, 00:15:14.659 { 00:15:14.659 "name": "BaseBdev3", 00:15:14.659 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:14.659 "is_configured": false, 00:15:14.659 "data_offset": 0, 00:15:14.659 "data_size": 0 00:15:14.659 } 
00:15:14.659 ] 00:15:14.659 }' 00:15:14.659 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:14.659 13:23:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:15.226 13:23:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:15.795 [2024-07-25 13:23:56.337212] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:15.795 [2024-07-25 13:23:56.337329] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d46ea0 00:15:15.795 [2024-07-25 13:23:56.337338] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:15.795 [2024-07-25 13:23:56.337475] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d46b70 00:15:15.795 [2024-07-25 13:23:56.337572] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d46ea0 00:15:15.795 [2024-07-25 13:23:56.337582] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d46ea0 00:15:15.795 [2024-07-25 13:23:56.337652] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:15.795 BaseBdev3 00:15:15.795 13:23:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:15.795 13:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:15.795 13:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:15.795 13:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:15.795 13:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:15.795 13:23:56 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:15.795 13:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:15.795 13:23:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:16.364 [ 00:15:16.364 { 00:15:16.364 "name": "BaseBdev3", 00:15:16.364 "aliases": [ 00:15:16.364 "379d8c52-abab-4b86-8c31-6200d27a654e" 00:15:16.364 ], 00:15:16.364 "product_name": "Malloc disk", 00:15:16.364 "block_size": 512, 00:15:16.364 "num_blocks": 65536, 00:15:16.364 "uuid": "379d8c52-abab-4b86-8c31-6200d27a654e", 00:15:16.364 "assigned_rate_limits": { 00:15:16.364 "rw_ios_per_sec": 0, 00:15:16.364 "rw_mbytes_per_sec": 0, 00:15:16.364 "r_mbytes_per_sec": 0, 00:15:16.364 "w_mbytes_per_sec": 0 00:15:16.364 }, 00:15:16.364 "claimed": true, 00:15:16.364 "claim_type": "exclusive_write", 00:15:16.364 "zoned": false, 00:15:16.364 "supported_io_types": { 00:15:16.364 "read": true, 00:15:16.364 "write": true, 00:15:16.364 "unmap": true, 00:15:16.364 "flush": true, 00:15:16.364 "reset": true, 00:15:16.364 "nvme_admin": false, 00:15:16.364 "nvme_io": false, 00:15:16.364 "nvme_io_md": false, 00:15:16.364 "write_zeroes": true, 00:15:16.364 "zcopy": true, 00:15:16.364 "get_zone_info": false, 00:15:16.364 "zone_management": false, 00:15:16.364 "zone_append": false, 00:15:16.364 "compare": false, 00:15:16.364 "compare_and_write": false, 00:15:16.364 "abort": true, 00:15:16.364 "seek_hole": false, 00:15:16.364 "seek_data": false, 00:15:16.364 "copy": true, 00:15:16.364 "nvme_iov_md": false 00:15:16.364 }, 00:15:16.364 "memory_domains": [ 00:15:16.364 { 00:15:16.364 "dma_device_id": "system", 00:15:16.364 "dma_device_type": 1 00:15:16.364 }, 00:15:16.364 { 00:15:16.364 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:15:16.364 "dma_device_type": 2 00:15:16.364 } 00:15:16.364 ], 00:15:16.364 "driver_specific": {} 00:15:16.364 } 00:15:16.364 ] 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:16.364 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:15:16.625 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:16.625 "name": "Existed_Raid", 00:15:16.625 "uuid": "53cd02c2-568c-4033-b9be-26533fe4e961", 00:15:16.625 "strip_size_kb": 64, 00:15:16.625 "state": "online", 00:15:16.625 "raid_level": "concat", 00:15:16.625 "superblock": true, 00:15:16.625 "num_base_bdevs": 3, 00:15:16.625 "num_base_bdevs_discovered": 3, 00:15:16.625 "num_base_bdevs_operational": 3, 00:15:16.625 "base_bdevs_list": [ 00:15:16.625 { 00:15:16.625 "name": "BaseBdev1", 00:15:16.625 "uuid": "8fe0b287-b529-469b-90e1-695299e9b64e", 00:15:16.625 "is_configured": true, 00:15:16.625 "data_offset": 2048, 00:15:16.625 "data_size": 63488 00:15:16.625 }, 00:15:16.625 { 00:15:16.625 "name": "BaseBdev2", 00:15:16.625 "uuid": "91dac954-a433-47d7-9302-94ba7ca22ce2", 00:15:16.625 "is_configured": true, 00:15:16.625 "data_offset": 2048, 00:15:16.625 "data_size": 63488 00:15:16.625 }, 00:15:16.625 { 00:15:16.625 "name": "BaseBdev3", 00:15:16.625 "uuid": "379d8c52-abab-4b86-8c31-6200d27a654e", 00:15:16.625 "is_configured": true, 00:15:16.625 "data_offset": 2048, 00:15:16.625 "data_size": 63488 00:15:16.625 } 00:15:16.625 ] 00:15:16.625 }' 00:15:16.625 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:16.625 13:23:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:17.194 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:17.194 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:17.194 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:17.194 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:17.194 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:15:17.194 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:17.194 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:17.194 13:23:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:17.454 [2024-07-25 13:23:58.029742] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:17.454 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:17.454 "name": "Existed_Raid", 00:15:17.454 "aliases": [ 00:15:17.454 "53cd02c2-568c-4033-b9be-26533fe4e961" 00:15:17.454 ], 00:15:17.454 "product_name": "Raid Volume", 00:15:17.454 "block_size": 512, 00:15:17.454 "num_blocks": 190464, 00:15:17.454 "uuid": "53cd02c2-568c-4033-b9be-26533fe4e961", 00:15:17.454 "assigned_rate_limits": { 00:15:17.454 "rw_ios_per_sec": 0, 00:15:17.454 "rw_mbytes_per_sec": 0, 00:15:17.454 "r_mbytes_per_sec": 0, 00:15:17.454 "w_mbytes_per_sec": 0 00:15:17.454 }, 00:15:17.454 "claimed": false, 00:15:17.454 "zoned": false, 00:15:17.454 "supported_io_types": { 00:15:17.454 "read": true, 00:15:17.454 "write": true, 00:15:17.454 "unmap": true, 00:15:17.454 "flush": true, 00:15:17.454 "reset": true, 00:15:17.454 "nvme_admin": false, 00:15:17.454 "nvme_io": false, 00:15:17.454 "nvme_io_md": false, 00:15:17.454 "write_zeroes": true, 00:15:17.454 "zcopy": false, 00:15:17.454 "get_zone_info": false, 00:15:17.454 "zone_management": false, 00:15:17.454 "zone_append": false, 00:15:17.454 "compare": false, 00:15:17.454 "compare_and_write": false, 00:15:17.454 "abort": false, 00:15:17.454 "seek_hole": false, 00:15:17.454 "seek_data": false, 00:15:17.454 "copy": false, 00:15:17.454 "nvme_iov_md": false 00:15:17.454 }, 00:15:17.454 "memory_domains": [ 00:15:17.454 { 00:15:17.454 "dma_device_id": "system", 
00:15:17.454 "dma_device_type": 1 00:15:17.454 }, 00:15:17.454 { 00:15:17.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.454 "dma_device_type": 2 00:15:17.454 }, 00:15:17.454 { 00:15:17.454 "dma_device_id": "system", 00:15:17.454 "dma_device_type": 1 00:15:17.454 }, 00:15:17.454 { 00:15:17.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.454 "dma_device_type": 2 00:15:17.454 }, 00:15:17.454 { 00:15:17.454 "dma_device_id": "system", 00:15:17.454 "dma_device_type": 1 00:15:17.454 }, 00:15:17.454 { 00:15:17.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.454 "dma_device_type": 2 00:15:17.454 } 00:15:17.454 ], 00:15:17.454 "driver_specific": { 00:15:17.454 "raid": { 00:15:17.454 "uuid": "53cd02c2-568c-4033-b9be-26533fe4e961", 00:15:17.454 "strip_size_kb": 64, 00:15:17.454 "state": "online", 00:15:17.454 "raid_level": "concat", 00:15:17.454 "superblock": true, 00:15:17.454 "num_base_bdevs": 3, 00:15:17.454 "num_base_bdevs_discovered": 3, 00:15:17.454 "num_base_bdevs_operational": 3, 00:15:17.454 "base_bdevs_list": [ 00:15:17.454 { 00:15:17.454 "name": "BaseBdev1", 00:15:17.454 "uuid": "8fe0b287-b529-469b-90e1-695299e9b64e", 00:15:17.454 "is_configured": true, 00:15:17.454 "data_offset": 2048, 00:15:17.454 "data_size": 63488 00:15:17.454 }, 00:15:17.454 { 00:15:17.454 "name": "BaseBdev2", 00:15:17.454 "uuid": "91dac954-a433-47d7-9302-94ba7ca22ce2", 00:15:17.454 "is_configured": true, 00:15:17.454 "data_offset": 2048, 00:15:17.454 "data_size": 63488 00:15:17.454 }, 00:15:17.454 { 00:15:17.454 "name": "BaseBdev3", 00:15:17.454 "uuid": "379d8c52-abab-4b86-8c31-6200d27a654e", 00:15:17.454 "is_configured": true, 00:15:17.454 "data_offset": 2048, 00:15:17.454 "data_size": 63488 00:15:17.454 } 00:15:17.454 ] 00:15:17.454 } 00:15:17.454 } 00:15:17.454 }' 00:15:17.454 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:17.454 13:23:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:17.454 BaseBdev2 00:15:17.454 BaseBdev3' 00:15:17.454 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:17.454 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:17.454 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:17.714 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:17.714 "name": "BaseBdev1", 00:15:17.714 "aliases": [ 00:15:17.714 "8fe0b287-b529-469b-90e1-695299e9b64e" 00:15:17.714 ], 00:15:17.714 "product_name": "Malloc disk", 00:15:17.714 "block_size": 512, 00:15:17.714 "num_blocks": 65536, 00:15:17.714 "uuid": "8fe0b287-b529-469b-90e1-695299e9b64e", 00:15:17.714 "assigned_rate_limits": { 00:15:17.714 "rw_ios_per_sec": 0, 00:15:17.714 "rw_mbytes_per_sec": 0, 00:15:17.714 "r_mbytes_per_sec": 0, 00:15:17.714 "w_mbytes_per_sec": 0 00:15:17.714 }, 00:15:17.714 "claimed": true, 00:15:17.714 "claim_type": "exclusive_write", 00:15:17.714 "zoned": false, 00:15:17.714 "supported_io_types": { 00:15:17.714 "read": true, 00:15:17.714 "write": true, 00:15:17.714 "unmap": true, 00:15:17.714 "flush": true, 00:15:17.714 "reset": true, 00:15:17.714 "nvme_admin": false, 00:15:17.714 "nvme_io": false, 00:15:17.714 "nvme_io_md": false, 00:15:17.714 "write_zeroes": true, 00:15:17.714 "zcopy": true, 00:15:17.714 "get_zone_info": false, 00:15:17.714 "zone_management": false, 00:15:17.714 "zone_append": false, 00:15:17.714 "compare": false, 00:15:17.714 "compare_and_write": false, 00:15:17.714 "abort": true, 00:15:17.714 "seek_hole": false, 00:15:17.714 "seek_data": false, 00:15:17.714 "copy": true, 00:15:17.714 "nvme_iov_md": false 00:15:17.714 }, 00:15:17.714 "memory_domains": 
[ 00:15:17.714 { 00:15:17.714 "dma_device_id": "system", 00:15:17.714 "dma_device_type": 1 00:15:17.714 }, 00:15:17.714 { 00:15:17.714 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:17.714 "dma_device_type": 2 00:15:17.714 } 00:15:17.714 ], 00:15:17.714 "driver_specific": {} 00:15:17.714 }' 00:15:17.714 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:17.714 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:17.714 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:17.714 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:17.714 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:17.714 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:17.714 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:17.974 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:17.974 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:17.974 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:17.974 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:17.974 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:17.974 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:17.974 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:17.974 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 
00:15:18.233 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:18.233 "name": "BaseBdev2", 00:15:18.233 "aliases": [ 00:15:18.233 "91dac954-a433-47d7-9302-94ba7ca22ce2" 00:15:18.233 ], 00:15:18.233 "product_name": "Malloc disk", 00:15:18.233 "block_size": 512, 00:15:18.233 "num_blocks": 65536, 00:15:18.233 "uuid": "91dac954-a433-47d7-9302-94ba7ca22ce2", 00:15:18.233 "assigned_rate_limits": { 00:15:18.233 "rw_ios_per_sec": 0, 00:15:18.233 "rw_mbytes_per_sec": 0, 00:15:18.233 "r_mbytes_per_sec": 0, 00:15:18.233 "w_mbytes_per_sec": 0 00:15:18.233 }, 00:15:18.233 "claimed": true, 00:15:18.233 "claim_type": "exclusive_write", 00:15:18.233 "zoned": false, 00:15:18.233 "supported_io_types": { 00:15:18.233 "read": true, 00:15:18.233 "write": true, 00:15:18.233 "unmap": true, 00:15:18.233 "flush": true, 00:15:18.233 "reset": true, 00:15:18.233 "nvme_admin": false, 00:15:18.233 "nvme_io": false, 00:15:18.233 "nvme_io_md": false, 00:15:18.233 "write_zeroes": true, 00:15:18.233 "zcopy": true, 00:15:18.233 "get_zone_info": false, 00:15:18.233 "zone_management": false, 00:15:18.233 "zone_append": false, 00:15:18.233 "compare": false, 00:15:18.233 "compare_and_write": false, 00:15:18.233 "abort": true, 00:15:18.233 "seek_hole": false, 00:15:18.233 "seek_data": false, 00:15:18.233 "copy": true, 00:15:18.233 "nvme_iov_md": false 00:15:18.233 }, 00:15:18.233 "memory_domains": [ 00:15:18.233 { 00:15:18.233 "dma_device_id": "system", 00:15:18.233 "dma_device_type": 1 00:15:18.233 }, 00:15:18.233 { 00:15:18.233 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.233 "dma_device_type": 2 00:15:18.233 } 00:15:18.233 ], 00:15:18.233 "driver_specific": {} 00:15:18.233 }' 00:15:18.233 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:18.233 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:18.233 13:23:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:18.233 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:18.233 13:23:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:18.233 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:18.233 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:18.492 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:18.492 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:18.492 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:18.492 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:18.492 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:18.492 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:18.492 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:18.492 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:18.752 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:18.752 "name": "BaseBdev3", 00:15:18.752 "aliases": [ 00:15:18.752 "379d8c52-abab-4b86-8c31-6200d27a654e" 00:15:18.752 ], 00:15:18.752 "product_name": "Malloc disk", 00:15:18.752 "block_size": 512, 00:15:18.752 "num_blocks": 65536, 00:15:18.752 "uuid": "379d8c52-abab-4b86-8c31-6200d27a654e", 00:15:18.752 "assigned_rate_limits": { 00:15:18.752 "rw_ios_per_sec": 0, 00:15:18.752 "rw_mbytes_per_sec": 0, 00:15:18.752 "r_mbytes_per_sec": 0, 00:15:18.752 
"w_mbytes_per_sec": 0 00:15:18.752 }, 00:15:18.752 "claimed": true, 00:15:18.752 "claim_type": "exclusive_write", 00:15:18.752 "zoned": false, 00:15:18.752 "supported_io_types": { 00:15:18.752 "read": true, 00:15:18.752 "write": true, 00:15:18.752 "unmap": true, 00:15:18.752 "flush": true, 00:15:18.752 "reset": true, 00:15:18.752 "nvme_admin": false, 00:15:18.752 "nvme_io": false, 00:15:18.752 "nvme_io_md": false, 00:15:18.752 "write_zeroes": true, 00:15:18.752 "zcopy": true, 00:15:18.752 "get_zone_info": false, 00:15:18.752 "zone_management": false, 00:15:18.752 "zone_append": false, 00:15:18.752 "compare": false, 00:15:18.752 "compare_and_write": false, 00:15:18.752 "abort": true, 00:15:18.752 "seek_hole": false, 00:15:18.752 "seek_data": false, 00:15:18.752 "copy": true, 00:15:18.752 "nvme_iov_md": false 00:15:18.752 }, 00:15:18.752 "memory_domains": [ 00:15:18.752 { 00:15:18.752 "dma_device_id": "system", 00:15:18.752 "dma_device_type": 1 00:15:18.752 }, 00:15:18.752 { 00:15:18.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:18.752 "dma_device_type": 2 00:15:18.752 } 00:15:18.752 ], 00:15:18.752 "driver_specific": {} 00:15:18.752 }' 00:15:18.752 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:18.752 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:18.752 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:18.752 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:18.752 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:18.752 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:18.752 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:19.011 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:15:19.011 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:19.011 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:19.011 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:19.011 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:19.011 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:19.270 [2024-07-25 13:23:59.882236] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:19.271 [2024-07-25 13:23:59.882255] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:19.271 [2024-07-25 13:23:59.882287] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:19.271 13:23:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:19.529 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:19.529 "name": "Existed_Raid", 00:15:19.529 "uuid": "53cd02c2-568c-4033-b9be-26533fe4e961", 00:15:19.529 "strip_size_kb": 64, 00:15:19.529 "state": "offline", 00:15:19.529 "raid_level": "concat", 00:15:19.529 "superblock": true, 00:15:19.529 "num_base_bdevs": 3, 00:15:19.529 "num_base_bdevs_discovered": 2, 00:15:19.530 "num_base_bdevs_operational": 2, 00:15:19.530 "base_bdevs_list": [ 00:15:19.530 { 00:15:19.530 "name": null, 00:15:19.530 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:19.530 "is_configured": false, 00:15:19.530 "data_offset": 2048, 00:15:19.530 "data_size": 63488 00:15:19.530 }, 00:15:19.530 { 00:15:19.530 "name": "BaseBdev2", 00:15:19.530 "uuid": "91dac954-a433-47d7-9302-94ba7ca22ce2", 00:15:19.530 "is_configured": true, 00:15:19.530 "data_offset": 2048, 00:15:19.530 "data_size": 
63488 00:15:19.530 }, 00:15:19.530 { 00:15:19.530 "name": "BaseBdev3", 00:15:19.530 "uuid": "379d8c52-abab-4b86-8c31-6200d27a654e", 00:15:19.530 "is_configured": true, 00:15:19.530 "data_offset": 2048, 00:15:19.530 "data_size": 63488 00:15:19.530 } 00:15:19.530 ] 00:15:19.530 }' 00:15:19.530 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:19.530 13:24:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:20.097 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:20.097 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:20.097 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.097 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:20.097 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:20.097 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:20.097 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:20.357 [2024-07-25 13:24:00.937247] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:20.357 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:20.358 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:20.358 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
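The `verify_raid_bdev_state` checks traced above reduce to fetching `bdev_raid_get_bdevs all` over the RPC socket, selecting the entry by name with `jq -r '.[] | select(.name == "Existed_Raid")'`, and comparing fields such as `state` and `num_base_bdevs_discovered` against the expected values. A minimal Python sketch of that same filtering step, using a hypothetical payload shaped like the JSON in this log (field values are illustrative, not taken as authoritative):

```python
import json

# Hypothetical sample mirroring the shape of `bdev_raid_get_bdevs all`
# output seen in this log; values are illustrative only.
payload = json.loads("""
[
  {
    "name": "Existed_Raid",
    "strip_size_kb": 64,
    "state": "offline",
    "raid_level": "concat",
    "num_base_bdevs": 3,
    "num_base_bdevs_discovered": 2,
    "num_base_bdevs_operational": 2
  }
]
""")

def select_raid(bdevs, name):
    # Equivalent of: jq -r '.[] | select(.name == "Existed_Raid")'
    return next((b for b in bdevs if b["name"] == name), None)

info = select_raid(payload, "Existed_Raid")
assert info is not None
print(info["state"], info["num_base_bdevs_discovered"])  # offline 2
```

The shell test does the same comparison with `[[ ... == ... ]]` after extracting each field with a separate `jq` filter.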
00:15:20.358 13:24:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:20.358 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:20.358 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:20.358 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:20.617 [2024-07-25 13:24:01.324035] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:20.617 [2024-07-25 13:24:01.324065] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d46ea0 name Existed_Raid, state offline 00:15:20.617 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:20.617 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:20.617 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.617 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:20.876 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:20.876 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:20.876 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:20.876 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:20.876 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:20.876 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:21.136 BaseBdev2 00:15:21.136 13:24:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:21.136 13:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:15:21.136 13:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:21.136 13:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:21.136 13:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:21.136 13:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:21.136 13:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:21.136 13:24:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:21.395 [ 00:15:21.395 { 00:15:21.395 "name": "BaseBdev2", 00:15:21.395 "aliases": [ 00:15:21.395 "44560a75-b9c5-4819-8115-50b89b0ca717" 00:15:21.395 ], 00:15:21.395 "product_name": "Malloc disk", 00:15:21.395 "block_size": 512, 00:15:21.395 "num_blocks": 65536, 00:15:21.395 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:21.395 "assigned_rate_limits": { 00:15:21.395 "rw_ios_per_sec": 0, 00:15:21.395 "rw_mbytes_per_sec": 0, 00:15:21.395 "r_mbytes_per_sec": 0, 00:15:21.395 "w_mbytes_per_sec": 0 00:15:21.395 }, 00:15:21.395 "claimed": false, 00:15:21.395 "zoned": false, 00:15:21.395 "supported_io_types": { 00:15:21.395 "read": true, 00:15:21.395 "write": true, 00:15:21.395 "unmap": true, 00:15:21.395 "flush": 
true, 00:15:21.395 "reset": true, 00:15:21.395 "nvme_admin": false, 00:15:21.395 "nvme_io": false, 00:15:21.395 "nvme_io_md": false, 00:15:21.395 "write_zeroes": true, 00:15:21.395 "zcopy": true, 00:15:21.395 "get_zone_info": false, 00:15:21.395 "zone_management": false, 00:15:21.395 "zone_append": false, 00:15:21.395 "compare": false, 00:15:21.395 "compare_and_write": false, 00:15:21.395 "abort": true, 00:15:21.395 "seek_hole": false, 00:15:21.395 "seek_data": false, 00:15:21.395 "copy": true, 00:15:21.395 "nvme_iov_md": false 00:15:21.395 }, 00:15:21.395 "memory_domains": [ 00:15:21.395 { 00:15:21.395 "dma_device_id": "system", 00:15:21.395 "dma_device_type": 1 00:15:21.395 }, 00:15:21.395 { 00:15:21.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:21.395 "dma_device_type": 2 00:15:21.395 } 00:15:21.395 ], 00:15:21.395 "driver_specific": {} 00:15:21.395 } 00:15:21.395 ] 00:15:21.395 13:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:21.395 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:21.395 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:21.395 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:21.677 BaseBdev3 00:15:21.677 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:21.677 13:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:15:21.677 13:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:21.677 13:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:21.677 13:24:02 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:21.677 13:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:21.677 13:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:22.009 13:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:22.009 [ 00:15:22.009 { 00:15:22.009 "name": "BaseBdev3", 00:15:22.009 "aliases": [ 00:15:22.009 "eba697b9-4e9f-4684-b475-4d222ca6d06d" 00:15:22.009 ], 00:15:22.009 "product_name": "Malloc disk", 00:15:22.009 "block_size": 512, 00:15:22.009 "num_blocks": 65536, 00:15:22.009 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:22.009 "assigned_rate_limits": { 00:15:22.009 "rw_ios_per_sec": 0, 00:15:22.009 "rw_mbytes_per_sec": 0, 00:15:22.009 "r_mbytes_per_sec": 0, 00:15:22.009 "w_mbytes_per_sec": 0 00:15:22.009 }, 00:15:22.009 "claimed": false, 00:15:22.009 "zoned": false, 00:15:22.009 "supported_io_types": { 00:15:22.009 "read": true, 00:15:22.009 "write": true, 00:15:22.009 "unmap": true, 00:15:22.009 "flush": true, 00:15:22.009 "reset": true, 00:15:22.009 "nvme_admin": false, 00:15:22.009 "nvme_io": false, 00:15:22.009 "nvme_io_md": false, 00:15:22.009 "write_zeroes": true, 00:15:22.009 "zcopy": true, 00:15:22.009 "get_zone_info": false, 00:15:22.009 "zone_management": false, 00:15:22.009 "zone_append": false, 00:15:22.009 "compare": false, 00:15:22.009 "compare_and_write": false, 00:15:22.009 "abort": true, 00:15:22.009 "seek_hole": false, 00:15:22.009 "seek_data": false, 00:15:22.009 "copy": true, 00:15:22.009 "nvme_iov_md": false 00:15:22.009 }, 00:15:22.009 "memory_domains": [ 00:15:22.009 { 00:15:22.009 "dma_device_id": "system", 00:15:22.009 "dma_device_type": 1 
00:15:22.009 }, 00:15:22.009 { 00:15:22.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:22.009 "dma_device_type": 2 00:15:22.009 } 00:15:22.009 ], 00:15:22.009 "driver_specific": {} 00:15:22.009 } 00:15:22.009 ] 00:15:22.009 13:24:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:22.009 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:22.009 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:22.009 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:22.268 [2024-07-25 13:24:02.859935] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:22.268 [2024-07-25 13:24:02.859963] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:22.268 [2024-07-25 13:24:02.859974] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:22.268 [2024-07-25 13:24:02.861016] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:22.268 13:24:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:22.268 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:22.268 "name": "Existed_Raid", 00:15:22.268 "uuid": "b3f9e655-663c-4a59-b1e0-4e14d1a894e2", 00:15:22.268 "strip_size_kb": 64, 00:15:22.268 "state": "configuring", 00:15:22.268 "raid_level": "concat", 00:15:22.268 "superblock": true, 00:15:22.268 "num_base_bdevs": 3, 00:15:22.268 "num_base_bdevs_discovered": 2, 00:15:22.268 "num_base_bdevs_operational": 3, 00:15:22.268 "base_bdevs_list": [ 00:15:22.268 { 00:15:22.268 "name": "BaseBdev1", 00:15:22.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:22.268 "is_configured": false, 00:15:22.268 "data_offset": 0, 00:15:22.268 "data_size": 0 00:15:22.268 }, 00:15:22.268 { 00:15:22.268 "name": "BaseBdev2", 00:15:22.268 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:22.268 "is_configured": true, 00:15:22.268 "data_offset": 2048, 00:15:22.268 "data_size": 63488 00:15:22.268 }, 00:15:22.268 { 00:15:22.268 "name": "BaseBdev3", 00:15:22.268 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:22.268 "is_configured": true, 00:15:22.268 "data_offset": 2048, 00:15:22.268 
"data_size": 63488 00:15:22.268 } 00:15:22.268 ] 00:15:22.268 }' 00:15:22.268 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:22.268 13:24:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.836 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:23.095 [2024-07-25 13:24:03.762198] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
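Elsewhere in this trace, `waitforbdev` (from `common/autotest_common.sh`) defaults `bdev_timeout` to 2000 ms when none is given, calls `bdev_wait_for_examine`, then issues `bdev_get_bdevs -b <name> -t 2000`. The general poll-until-present idea behind it can be sketched as follows; this is a hypothetical helper for illustration, not the SPDK implementation, and `get_bdevs` stands in for the `rpc.py bdev_get_bdevs` call:

```python
import time

def wait_for_bdev(get_bdevs, name, timeout_ms=2000, poll_ms=50):
    """Poll get_bdevs() until a bdev named `name` appears or the timeout expires.

    Sketch of the waitforbdev pattern; get_bdevs is a caller-supplied
    callable returning a list of bdev dicts (as bdev_get_bdevs would).
    """
    deadline = time.monotonic() + timeout_ms / 1000.0
    while time.monotonic() < deadline:
        if any(b.get("name") == name for b in get_bdevs()):
            return True
        time.sleep(poll_ms / 1000.0)
    return False

# Usage with a stub in place of the RPC call:
bdevs = [{"name": "BaseBdev1"}]
def get_bdevs():
    return bdevs

print(wait_for_bdev(get_bdevs, "BaseBdev1", timeout_ms=200))  # True
```

Note that in the real script the 2000 ms timeout is passed through to the RPC side via `-t 2000` rather than implemented as a client-side loop; the sketch only illustrates the wait-for-readiness idea.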
00:15:23.095 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.354 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.354 "name": "Existed_Raid", 00:15:23.354 "uuid": "b3f9e655-663c-4a59-b1e0-4e14d1a894e2", 00:15:23.354 "strip_size_kb": 64, 00:15:23.354 "state": "configuring", 00:15:23.354 "raid_level": "concat", 00:15:23.354 "superblock": true, 00:15:23.354 "num_base_bdevs": 3, 00:15:23.354 "num_base_bdevs_discovered": 1, 00:15:23.354 "num_base_bdevs_operational": 3, 00:15:23.354 "base_bdevs_list": [ 00:15:23.354 { 00:15:23.354 "name": "BaseBdev1", 00:15:23.354 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:23.354 "is_configured": false, 00:15:23.354 "data_offset": 0, 00:15:23.354 "data_size": 0 00:15:23.354 }, 00:15:23.354 { 00:15:23.354 "name": null, 00:15:23.354 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:23.354 "is_configured": false, 00:15:23.354 "data_offset": 2048, 00:15:23.354 "data_size": 63488 00:15:23.354 }, 00:15:23.354 { 00:15:23.354 "name": "BaseBdev3", 00:15:23.354 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:23.354 "is_configured": true, 00:15:23.354 "data_offset": 2048, 00:15:23.354 "data_size": 63488 00:15:23.354 } 00:15:23.354 ] 00:15:23.354 }' 00:15:23.354 13:24:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.354 13:24:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:23.922 13:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.922 13:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:23.922 13:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == 
\f\a\l\s\e ]] 00:15:23.922 13:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:24.181 [2024-07-25 13:24:04.841911] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:24.181 BaseBdev1 00:15:24.181 13:24:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:24.181 13:24:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:15:24.181 13:24:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:24.181 13:24:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:24.181 13:24:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:24.181 13:24:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:24.181 13:24:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:24.440 13:24:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:24.440 [ 00:15:24.440 { 00:15:24.440 "name": "BaseBdev1", 00:15:24.440 "aliases": [ 00:15:24.440 "bcba4239-8dcb-416c-9d70-3ed3acd44698" 00:15:24.440 ], 00:15:24.440 "product_name": "Malloc disk", 00:15:24.440 "block_size": 512, 00:15:24.440 "num_blocks": 65536, 00:15:24.440 "uuid": "bcba4239-8dcb-416c-9d70-3ed3acd44698", 00:15:24.440 "assigned_rate_limits": { 00:15:24.440 "rw_ios_per_sec": 0, 00:15:24.440 "rw_mbytes_per_sec": 0, 00:15:24.440 "r_mbytes_per_sec": 0, 00:15:24.440 
"w_mbytes_per_sec": 0 00:15:24.440 }, 00:15:24.440 "claimed": true, 00:15:24.440 "claim_type": "exclusive_write", 00:15:24.440 "zoned": false, 00:15:24.440 "supported_io_types": { 00:15:24.440 "read": true, 00:15:24.440 "write": true, 00:15:24.440 "unmap": true, 00:15:24.440 "flush": true, 00:15:24.440 "reset": true, 00:15:24.440 "nvme_admin": false, 00:15:24.440 "nvme_io": false, 00:15:24.440 "nvme_io_md": false, 00:15:24.440 "write_zeroes": true, 00:15:24.440 "zcopy": true, 00:15:24.440 "get_zone_info": false, 00:15:24.440 "zone_management": false, 00:15:24.440 "zone_append": false, 00:15:24.440 "compare": false, 00:15:24.440 "compare_and_write": false, 00:15:24.440 "abort": true, 00:15:24.440 "seek_hole": false, 00:15:24.440 "seek_data": false, 00:15:24.440 "copy": true, 00:15:24.440 "nvme_iov_md": false 00:15:24.440 }, 00:15:24.440 "memory_domains": [ 00:15:24.440 { 00:15:24.440 "dma_device_id": "system", 00:15:24.440 "dma_device_type": 1 00:15:24.440 }, 00:15:24.440 { 00:15:24.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.440 "dma_device_type": 2 00:15:24.440 } 00:15:24.440 ], 00:15:24.440 "driver_specific": {} 00:15:24.440 } 00:15:24.440 ] 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.699 "name": "Existed_Raid", 00:15:24.699 "uuid": "b3f9e655-663c-4a59-b1e0-4e14d1a894e2", 00:15:24.699 "strip_size_kb": 64, 00:15:24.699 "state": "configuring", 00:15:24.699 "raid_level": "concat", 00:15:24.699 "superblock": true, 00:15:24.699 "num_base_bdevs": 3, 00:15:24.699 "num_base_bdevs_discovered": 2, 00:15:24.699 "num_base_bdevs_operational": 3, 00:15:24.699 "base_bdevs_list": [ 00:15:24.699 { 00:15:24.699 "name": "BaseBdev1", 00:15:24.699 "uuid": "bcba4239-8dcb-416c-9d70-3ed3acd44698", 00:15:24.699 "is_configured": true, 00:15:24.699 "data_offset": 2048, 00:15:24.699 "data_size": 63488 00:15:24.699 }, 00:15:24.699 { 00:15:24.699 "name": null, 00:15:24.699 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:24.699 "is_configured": false, 00:15:24.699 "data_offset": 2048, 00:15:24.699 "data_size": 63488 00:15:24.699 }, 00:15:24.699 { 00:15:24.699 "name": "BaseBdev3", 00:15:24.699 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:24.699 "is_configured": true, 00:15:24.699 "data_offset": 2048, 00:15:24.699 "data_size": 63488 00:15:24.699 } 
00:15:24.699 ] 00:15:24.699 }' 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.699 13:24:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:25.267 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:25.267 13:24:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.526 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:25.526 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:25.785 [2024-07-25 13:24:06.321687] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:25.785 
13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:25.785 "name": "Existed_Raid", 00:15:25.785 "uuid": "b3f9e655-663c-4a59-b1e0-4e14d1a894e2", 00:15:25.785 "strip_size_kb": 64, 00:15:25.785 "state": "configuring", 00:15:25.785 "raid_level": "concat", 00:15:25.785 "superblock": true, 00:15:25.785 "num_base_bdevs": 3, 00:15:25.785 "num_base_bdevs_discovered": 1, 00:15:25.785 "num_base_bdevs_operational": 3, 00:15:25.785 "base_bdevs_list": [ 00:15:25.785 { 00:15:25.785 "name": "BaseBdev1", 00:15:25.785 "uuid": "bcba4239-8dcb-416c-9d70-3ed3acd44698", 00:15:25.785 "is_configured": true, 00:15:25.785 "data_offset": 2048, 00:15:25.785 "data_size": 63488 00:15:25.785 }, 00:15:25.785 { 00:15:25.785 "name": null, 00:15:25.785 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:25.785 "is_configured": false, 00:15:25.785 "data_offset": 2048, 00:15:25.785 "data_size": 63488 00:15:25.785 }, 00:15:25.785 { 00:15:25.785 "name": null, 00:15:25.785 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:25.785 "is_configured": false, 00:15:25.785 "data_offset": 2048, 00:15:25.785 "data_size": 63488 00:15:25.785 } 00:15:25.785 ] 00:15:25.785 }' 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:25.785 13:24:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:26.352 13:24:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.352 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:26.612 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:26.612 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:26.873 [2024-07-25 13:24:07.420495] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:26.873 13:24:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:26.873 "name": "Existed_Raid", 00:15:26.873 "uuid": "b3f9e655-663c-4a59-b1e0-4e14d1a894e2", 00:15:26.873 "strip_size_kb": 64, 00:15:26.873 "state": "configuring", 00:15:26.873 "raid_level": "concat", 00:15:26.873 "superblock": true, 00:15:26.873 "num_base_bdevs": 3, 00:15:26.873 "num_base_bdevs_discovered": 2, 00:15:26.873 "num_base_bdevs_operational": 3, 00:15:26.873 "base_bdevs_list": [ 00:15:26.873 { 00:15:26.873 "name": "BaseBdev1", 00:15:26.873 "uuid": "bcba4239-8dcb-416c-9d70-3ed3acd44698", 00:15:26.873 "is_configured": true, 00:15:26.873 "data_offset": 2048, 00:15:26.873 "data_size": 63488 00:15:26.873 }, 00:15:26.873 { 00:15:26.873 "name": null, 00:15:26.873 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:26.873 "is_configured": false, 00:15:26.873 "data_offset": 2048, 00:15:26.873 "data_size": 63488 00:15:26.873 }, 00:15:26.873 { 00:15:26.873 "name": "BaseBdev3", 00:15:26.873 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:26.873 "is_configured": true, 00:15:26.873 "data_offset": 2048, 00:15:26.873 "data_size": 63488 00:15:26.873 } 00:15:26.873 ] 00:15:26.873 }' 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:26.873 13:24:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:27.441 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.441 13:24:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:27.701 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:27.701 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:27.961 [2024-07-25 13:24:08.531321] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:27.961 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:27.961 "name": "Existed_Raid", 00:15:27.961 "uuid": "b3f9e655-663c-4a59-b1e0-4e14d1a894e2", 00:15:27.961 "strip_size_kb": 64, 00:15:27.961 "state": "configuring", 00:15:27.961 "raid_level": "concat", 00:15:27.961 "superblock": true, 00:15:27.961 "num_base_bdevs": 3, 00:15:27.961 "num_base_bdevs_discovered": 1, 00:15:27.961 "num_base_bdevs_operational": 3, 00:15:27.961 "base_bdevs_list": [ 00:15:27.961 { 00:15:27.961 "name": null, 00:15:27.961 "uuid": "bcba4239-8dcb-416c-9d70-3ed3acd44698", 00:15:27.961 "is_configured": false, 00:15:27.961 "data_offset": 2048, 00:15:27.961 "data_size": 63488 00:15:27.961 }, 00:15:27.961 { 00:15:27.961 "name": null, 00:15:27.961 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:27.961 "is_configured": false, 00:15:27.961 "data_offset": 2048, 00:15:27.961 "data_size": 63488 00:15:27.961 }, 00:15:27.961 { 00:15:27.961 "name": "BaseBdev3", 00:15:27.961 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:27.962 "is_configured": true, 00:15:27.962 "data_offset": 2048, 00:15:27.962 "data_size": 63488 00:15:27.962 } 00:15:27.962 ] 00:15:27.962 }' 00:15:27.962 13:24:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:27.962 13:24:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:28.531 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:28.531 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.791 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:28.791 13:24:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:29.051 [2024-07-25 13:24:09.668035] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.051 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:29.311 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:29.311 "name": 
"Existed_Raid", 00:15:29.311 "uuid": "b3f9e655-663c-4a59-b1e0-4e14d1a894e2", 00:15:29.311 "strip_size_kb": 64, 00:15:29.311 "state": "configuring", 00:15:29.311 "raid_level": "concat", 00:15:29.311 "superblock": true, 00:15:29.311 "num_base_bdevs": 3, 00:15:29.311 "num_base_bdevs_discovered": 2, 00:15:29.311 "num_base_bdevs_operational": 3, 00:15:29.311 "base_bdevs_list": [ 00:15:29.311 { 00:15:29.311 "name": null, 00:15:29.311 "uuid": "bcba4239-8dcb-416c-9d70-3ed3acd44698", 00:15:29.311 "is_configured": false, 00:15:29.311 "data_offset": 2048, 00:15:29.311 "data_size": 63488 00:15:29.311 }, 00:15:29.311 { 00:15:29.311 "name": "BaseBdev2", 00:15:29.311 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:29.311 "is_configured": true, 00:15:29.311 "data_offset": 2048, 00:15:29.311 "data_size": 63488 00:15:29.311 }, 00:15:29.311 { 00:15:29.311 "name": "BaseBdev3", 00:15:29.311 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:29.311 "is_configured": true, 00:15:29.311 "data_offset": 2048, 00:15:29.311 "data_size": 63488 00:15:29.311 } 00:15:29.311 ] 00:15:29.311 }' 00:15:29.311 13:24:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:29.311 13:24:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:29.881 13:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.881 13:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:29.881 13:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:29.882 13:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.882 13:24:10 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:30.142 13:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u bcba4239-8dcb-416c-9d70-3ed3acd44698 00:15:30.402 [2024-07-25 13:24:10.980384] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:30.402 [2024-07-25 13:24:10.980494] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1d47b40 00:15:30.402 [2024-07-25 13:24:10.980502] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:30.402 [2024-07-25 13:24:10.980642] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d46e70 00:15:30.402 [2024-07-25 13:24:10.980728] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1d47b40 00:15:30.402 [2024-07-25 13:24:10.980734] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1d47b40 00:15:30.402 [2024-07-25 13:24:10.980800] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:30.402 NewBaseBdev 00:15:30.402 13:24:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:30.402 13:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:15:30.402 13:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:15:30.402 13:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:15:30.402 13:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:15:30.402 13:24:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:15:30.402 13:24:10 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:30.402 13:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:30.662 [ 00:15:30.662 { 00:15:30.662 "name": "NewBaseBdev", 00:15:30.662 "aliases": [ 00:15:30.662 "bcba4239-8dcb-416c-9d70-3ed3acd44698" 00:15:30.662 ], 00:15:30.662 "product_name": "Malloc disk", 00:15:30.662 "block_size": 512, 00:15:30.662 "num_blocks": 65536, 00:15:30.662 "uuid": "bcba4239-8dcb-416c-9d70-3ed3acd44698", 00:15:30.662 "assigned_rate_limits": { 00:15:30.662 "rw_ios_per_sec": 0, 00:15:30.662 "rw_mbytes_per_sec": 0, 00:15:30.662 "r_mbytes_per_sec": 0, 00:15:30.662 "w_mbytes_per_sec": 0 00:15:30.662 }, 00:15:30.662 "claimed": true, 00:15:30.662 "claim_type": "exclusive_write", 00:15:30.662 "zoned": false, 00:15:30.662 "supported_io_types": { 00:15:30.662 "read": true, 00:15:30.662 "write": true, 00:15:30.662 "unmap": true, 00:15:30.662 "flush": true, 00:15:30.662 "reset": true, 00:15:30.662 "nvme_admin": false, 00:15:30.662 "nvme_io": false, 00:15:30.662 "nvme_io_md": false, 00:15:30.662 "write_zeroes": true, 00:15:30.662 "zcopy": true, 00:15:30.662 "get_zone_info": false, 00:15:30.662 "zone_management": false, 00:15:30.662 "zone_append": false, 00:15:30.662 "compare": false, 00:15:30.662 "compare_and_write": false, 00:15:30.662 "abort": true, 00:15:30.662 "seek_hole": false, 00:15:30.662 "seek_data": false, 00:15:30.662 "copy": true, 00:15:30.662 "nvme_iov_md": false 00:15:30.662 }, 00:15:30.662 "memory_domains": [ 00:15:30.662 { 00:15:30.662 "dma_device_id": "system", 00:15:30.662 "dma_device_type": 1 00:15:30.662 }, 00:15:30.662 { 00:15:30.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.662 "dma_device_type": 2 00:15:30.662 } 
00:15:30.662 ], 00:15:30.662 "driver_specific": {} 00:15:30.662 } 00:15:30.662 ] 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:30.662 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:30.922 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:30.922 "name": "Existed_Raid", 00:15:30.922 "uuid": "b3f9e655-663c-4a59-b1e0-4e14d1a894e2", 00:15:30.922 "strip_size_kb": 64, 00:15:30.922 "state": "online", 00:15:30.922 
"raid_level": "concat", 00:15:30.922 "superblock": true, 00:15:30.922 "num_base_bdevs": 3, 00:15:30.922 "num_base_bdevs_discovered": 3, 00:15:30.922 "num_base_bdevs_operational": 3, 00:15:30.922 "base_bdevs_list": [ 00:15:30.922 { 00:15:30.922 "name": "NewBaseBdev", 00:15:30.922 "uuid": "bcba4239-8dcb-416c-9d70-3ed3acd44698", 00:15:30.922 "is_configured": true, 00:15:30.922 "data_offset": 2048, 00:15:30.922 "data_size": 63488 00:15:30.922 }, 00:15:30.922 { 00:15:30.922 "name": "BaseBdev2", 00:15:30.922 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:30.922 "is_configured": true, 00:15:30.922 "data_offset": 2048, 00:15:30.922 "data_size": 63488 00:15:30.922 }, 00:15:30.922 { 00:15:30.922 "name": "BaseBdev3", 00:15:30.922 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:30.922 "is_configured": true, 00:15:30.922 "data_offset": 2048, 00:15:30.922 "data_size": 63488 00:15:30.922 } 00:15:30.922 ] 00:15:30.922 }' 00:15:30.922 13:24:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:30.922 13:24:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:31.493 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:31.493 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:31.493 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:31.493 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:31.493 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:31.493 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:31.493 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:31.493 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:31.493 [2024-07-25 13:24:12.267888] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:31.753 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:31.753 "name": "Existed_Raid", 00:15:31.753 "aliases": [ 00:15:31.753 "b3f9e655-663c-4a59-b1e0-4e14d1a894e2" 00:15:31.753 ], 00:15:31.753 "product_name": "Raid Volume", 00:15:31.753 "block_size": 512, 00:15:31.753 "num_blocks": 190464, 00:15:31.753 "uuid": "b3f9e655-663c-4a59-b1e0-4e14d1a894e2", 00:15:31.753 "assigned_rate_limits": { 00:15:31.753 "rw_ios_per_sec": 0, 00:15:31.753 "rw_mbytes_per_sec": 0, 00:15:31.753 "r_mbytes_per_sec": 0, 00:15:31.753 "w_mbytes_per_sec": 0 00:15:31.753 }, 00:15:31.753 "claimed": false, 00:15:31.753 "zoned": false, 00:15:31.753 "supported_io_types": { 00:15:31.753 "read": true, 00:15:31.753 "write": true, 00:15:31.753 "unmap": true, 00:15:31.753 "flush": true, 00:15:31.753 "reset": true, 00:15:31.753 "nvme_admin": false, 00:15:31.753 "nvme_io": false, 00:15:31.753 "nvme_io_md": false, 00:15:31.753 "write_zeroes": true, 00:15:31.753 "zcopy": false, 00:15:31.753 "get_zone_info": false, 00:15:31.753 "zone_management": false, 00:15:31.753 "zone_append": false, 00:15:31.753 "compare": false, 00:15:31.753 "compare_and_write": false, 00:15:31.753 "abort": false, 00:15:31.753 "seek_hole": false, 00:15:31.753 "seek_data": false, 00:15:31.753 "copy": false, 00:15:31.753 "nvme_iov_md": false 00:15:31.753 }, 00:15:31.753 "memory_domains": [ 00:15:31.753 { 00:15:31.753 "dma_device_id": "system", 00:15:31.753 "dma_device_type": 1 00:15:31.753 }, 00:15:31.753 { 00:15:31.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.753 "dma_device_type": 2 00:15:31.753 }, 00:15:31.753 { 00:15:31.753 "dma_device_id": "system", 00:15:31.753 "dma_device_type": 1 00:15:31.753 
}, 00:15:31.753 { 00:15:31.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.753 "dma_device_type": 2 00:15:31.753 }, 00:15:31.753 { 00:15:31.753 "dma_device_id": "system", 00:15:31.753 "dma_device_type": 1 00:15:31.753 }, 00:15:31.753 { 00:15:31.753 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.753 "dma_device_type": 2 00:15:31.753 } 00:15:31.753 ], 00:15:31.753 "driver_specific": { 00:15:31.753 "raid": { 00:15:31.753 "uuid": "b3f9e655-663c-4a59-b1e0-4e14d1a894e2", 00:15:31.753 "strip_size_kb": 64, 00:15:31.753 "state": "online", 00:15:31.753 "raid_level": "concat", 00:15:31.754 "superblock": true, 00:15:31.754 "num_base_bdevs": 3, 00:15:31.754 "num_base_bdevs_discovered": 3, 00:15:31.754 "num_base_bdevs_operational": 3, 00:15:31.754 "base_bdevs_list": [ 00:15:31.754 { 00:15:31.754 "name": "NewBaseBdev", 00:15:31.754 "uuid": "bcba4239-8dcb-416c-9d70-3ed3acd44698", 00:15:31.754 "is_configured": true, 00:15:31.754 "data_offset": 2048, 00:15:31.754 "data_size": 63488 00:15:31.754 }, 00:15:31.754 { 00:15:31.754 "name": "BaseBdev2", 00:15:31.754 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:31.754 "is_configured": true, 00:15:31.754 "data_offset": 2048, 00:15:31.754 "data_size": 63488 00:15:31.754 }, 00:15:31.754 { 00:15:31.754 "name": "BaseBdev3", 00:15:31.754 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:31.754 "is_configured": true, 00:15:31.754 "data_offset": 2048, 00:15:31.754 "data_size": 63488 00:15:31.754 } 00:15:31.754 ] 00:15:31.754 } 00:15:31.754 } 00:15:31.754 }' 00:15:31.754 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:31.754 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:31.754 BaseBdev2 00:15:31.754 BaseBdev3' 00:15:31.754 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:31.754 
13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:31.754 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:31.754 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:31.754 "name": "NewBaseBdev", 00:15:31.754 "aliases": [ 00:15:31.754 "bcba4239-8dcb-416c-9d70-3ed3acd44698" 00:15:31.754 ], 00:15:31.754 "product_name": "Malloc disk", 00:15:31.754 "block_size": 512, 00:15:31.754 "num_blocks": 65536, 00:15:31.754 "uuid": "bcba4239-8dcb-416c-9d70-3ed3acd44698", 00:15:31.754 "assigned_rate_limits": { 00:15:31.754 "rw_ios_per_sec": 0, 00:15:31.754 "rw_mbytes_per_sec": 0, 00:15:31.754 "r_mbytes_per_sec": 0, 00:15:31.754 "w_mbytes_per_sec": 0 00:15:31.754 }, 00:15:31.754 "claimed": true, 00:15:31.754 "claim_type": "exclusive_write", 00:15:31.754 "zoned": false, 00:15:31.754 "supported_io_types": { 00:15:31.754 "read": true, 00:15:31.754 "write": true, 00:15:31.754 "unmap": true, 00:15:31.754 "flush": true, 00:15:31.754 "reset": true, 00:15:31.754 "nvme_admin": false, 00:15:31.754 "nvme_io": false, 00:15:31.754 "nvme_io_md": false, 00:15:31.754 "write_zeroes": true, 00:15:31.754 "zcopy": true, 00:15:31.754 "get_zone_info": false, 00:15:31.754 "zone_management": false, 00:15:31.754 "zone_append": false, 00:15:31.754 "compare": false, 00:15:31.754 "compare_and_write": false, 00:15:31.754 "abort": true, 00:15:31.754 "seek_hole": false, 00:15:31.754 "seek_data": false, 00:15:31.754 "copy": true, 00:15:31.754 "nvme_iov_md": false 00:15:31.754 }, 00:15:31.754 "memory_domains": [ 00:15:31.754 { 00:15:31.754 "dma_device_id": "system", 00:15:31.754 "dma_device_type": 1 00:15:31.754 }, 00:15:31.754 { 00:15:31.754 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.754 "dma_device_type": 2 00:15:31.754 } 00:15:31.754 ], 00:15:31.754 
"driver_specific": {} 00:15:31.754 }' 00:15:31.754 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.013 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.013 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.013 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.013 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.013 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.013 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.013 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.013 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.014 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.273 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.273 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.273 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.273 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:32.274 13:24:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.274 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:32.274 "name": "BaseBdev2", 00:15:32.274 "aliases": [ 00:15:32.274 "44560a75-b9c5-4819-8115-50b89b0ca717" 00:15:32.274 ], 00:15:32.274 "product_name": 
"Malloc disk", 00:15:32.274 "block_size": 512, 00:15:32.274 "num_blocks": 65536, 00:15:32.274 "uuid": "44560a75-b9c5-4819-8115-50b89b0ca717", 00:15:32.274 "assigned_rate_limits": { 00:15:32.274 "rw_ios_per_sec": 0, 00:15:32.274 "rw_mbytes_per_sec": 0, 00:15:32.274 "r_mbytes_per_sec": 0, 00:15:32.274 "w_mbytes_per_sec": 0 00:15:32.274 }, 00:15:32.274 "claimed": true, 00:15:32.274 "claim_type": "exclusive_write", 00:15:32.274 "zoned": false, 00:15:32.274 "supported_io_types": { 00:15:32.274 "read": true, 00:15:32.274 "write": true, 00:15:32.274 "unmap": true, 00:15:32.274 "flush": true, 00:15:32.274 "reset": true, 00:15:32.274 "nvme_admin": false, 00:15:32.274 "nvme_io": false, 00:15:32.274 "nvme_io_md": false, 00:15:32.274 "write_zeroes": true, 00:15:32.274 "zcopy": true, 00:15:32.274 "get_zone_info": false, 00:15:32.274 "zone_management": false, 00:15:32.274 "zone_append": false, 00:15:32.274 "compare": false, 00:15:32.274 "compare_and_write": false, 00:15:32.274 "abort": true, 00:15:32.274 "seek_hole": false, 00:15:32.274 "seek_data": false, 00:15:32.274 "copy": true, 00:15:32.274 "nvme_iov_md": false 00:15:32.274 }, 00:15:32.274 "memory_domains": [ 00:15:32.274 { 00:15:32.274 "dma_device_id": "system", 00:15:32.274 "dma_device_type": 1 00:15:32.274 }, 00:15:32.274 { 00:15:32.274 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:32.274 "dma_device_type": 2 00:15:32.274 } 00:15:32.274 ], 00:15:32.274 "driver_specific": {} 00:15:32.274 }' 00:15:32.274 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.533 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:32.533 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:32.533 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.533 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:32.533 
13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:32.533 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.533 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:32.792 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:32.792 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.792 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:32.792 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:32.792 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:32.792 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:32.792 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:33.051 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:33.051 "name": "BaseBdev3", 00:15:33.051 "aliases": [ 00:15:33.051 "eba697b9-4e9f-4684-b475-4d222ca6d06d" 00:15:33.051 ], 00:15:33.051 "product_name": "Malloc disk", 00:15:33.051 "block_size": 512, 00:15:33.051 "num_blocks": 65536, 00:15:33.051 "uuid": "eba697b9-4e9f-4684-b475-4d222ca6d06d", 00:15:33.051 "assigned_rate_limits": { 00:15:33.051 "rw_ios_per_sec": 0, 00:15:33.051 "rw_mbytes_per_sec": 0, 00:15:33.051 "r_mbytes_per_sec": 0, 00:15:33.051 "w_mbytes_per_sec": 0 00:15:33.051 }, 00:15:33.051 "claimed": true, 00:15:33.051 "claim_type": "exclusive_write", 00:15:33.051 "zoned": false, 00:15:33.051 "supported_io_types": { 00:15:33.051 "read": true, 00:15:33.051 "write": true, 00:15:33.051 "unmap": true, 
00:15:33.051 "flush": true, 00:15:33.051 "reset": true, 00:15:33.051 "nvme_admin": false, 00:15:33.051 "nvme_io": false, 00:15:33.051 "nvme_io_md": false, 00:15:33.051 "write_zeroes": true, 00:15:33.051 "zcopy": true, 00:15:33.051 "get_zone_info": false, 00:15:33.051 "zone_management": false, 00:15:33.051 "zone_append": false, 00:15:33.051 "compare": false, 00:15:33.051 "compare_and_write": false, 00:15:33.051 "abort": true, 00:15:33.051 "seek_hole": false, 00:15:33.051 "seek_data": false, 00:15:33.051 "copy": true, 00:15:33.051 "nvme_iov_md": false 00:15:33.051 }, 00:15:33.051 "memory_domains": [ 00:15:33.051 { 00:15:33.051 "dma_device_id": "system", 00:15:33.051 "dma_device_type": 1 00:15:33.051 }, 00:15:33.051 { 00:15:33.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:33.051 "dma_device_type": 2 00:15:33.051 } 00:15:33.051 ], 00:15:33.051 "driver_specific": {} 00:15:33.051 }' 00:15:33.051 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.051 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:33.051 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:33.051 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.051 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:33.051 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:33.051 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.051 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:33.309 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:33.309 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.309 13:24:13 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:33.309 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:33.309 13:24:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:33.568 [2024-07-25 13:24:14.144410] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:33.568 [2024-07-25 13:24:14.144427] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:33.568 [2024-07-25 13:24:14.144460] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:33.568 [2024-07-25 13:24:14.144496] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:33.568 [2024-07-25 13:24:14.144502] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1d47b40 name Existed_Raid, state offline 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 915507 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 915507 ']' 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 915507 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 915507 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = 
sudo ']' 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 915507' 00:15:33.568 killing process with pid 915507 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 915507 00:15:33.568 [2024-07-25 13:24:14.211299] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 915507 00:15:33.568 [2024-07-25 13:24:14.226009] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:33.568 00:15:33.568 real 0m24.818s 00:15:33.568 user 0m46.503s 00:15:33.568 sys 0m3.562s 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:33.568 13:24:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:33.568 ************************************ 00:15:33.568 END TEST raid_state_function_test_sb 00:15:33.568 ************************************ 00:15:33.828 13:24:14 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:15:33.828 13:24:14 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:15:33.828 13:24:14 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:33.828 13:24:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:33.828 ************************************ 00:15:33.828 START TEST raid_superblock_test 00:15:33.828 ************************************ 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 3 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=920895 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 920895 /var/tmp/spdk-raid.sock 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 920895 ']' 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:33.828 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:33.828 13:24:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:33.828 [2024-07-25 13:24:14.486298] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:15:33.828 [2024-07-25 13:24:14.486350] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid920895 ] 00:15:33.828 [2024-07-25 13:24:14.577770] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:34.087 [2024-07-25 13:24:14.646016] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:34.087 [2024-07-25 13:24:14.689541] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:34.087 [2024-07-25 13:24:14.689566] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
(( i = 1 )) 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:34.655 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:15:34.914 malloc1 00:15:34.914 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:34.914 [2024-07-25 13:24:15.684496] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:34.914 [2024-07-25 13:24:15.684530] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:34.914 [2024-07-25 13:24:15.684541] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25bb9b0 00:15:34.914 [2024-07-25 13:24:15.684552] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:34.914 [2024-07-25 13:24:15.685827] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:34.914 [2024-07-25 13:24:15.685849] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: pt1 00:15:34.914 pt1 00:15:34.914 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:15:34.914 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:34.914 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:15:34.914 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:15:34.914 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:34.914 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:34.914 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:15:34.914 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:34.914 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:35.174 malloc2 00:15:35.174 13:24:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:35.433 [2024-07-25 13:24:16.067503] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:35.433 [2024-07-25 13:24:16.067534] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:35.433 [2024-07-25 13:24:16.067543] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25bcdb0 00:15:35.433 [2024-07-25 13:24:16.067555] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:35.433 [2024-07-25 13:24:16.068779] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:15:35.433 [2024-07-25 13:24:16.068798] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:35.433 pt2 00:15:35.433 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:15:35.433 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:35.433 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:15:35.433 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:15:35.433 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:35.433 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:35.433 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:15:35.433 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:35.433 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:35.692 malloc3 00:15:35.692 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:35.692 [2024-07-25 13:24:16.454404] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:35.692 [2024-07-25 13:24:16.454435] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:35.692 [2024-07-25 13:24:16.454444] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2753780 00:15:35.692 [2024-07-25 13:24:16.454450] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev 
claimed 00:15:35.692 [2024-07-25 13:24:16.455646] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:35.692 [2024-07-25 13:24:16.455666] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:35.692 pt3 00:15:35.692 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:15:35.692 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:15:35.692 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:35.951 [2024-07-25 13:24:16.646906] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:35.951 [2024-07-25 13:24:16.647909] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:35.951 [2024-07-25 13:24:16.647950] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:35.951 [2024-07-25 13:24:16.648056] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x25b42e0 00:15:35.951 [2024-07-25 13:24:16.648062] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:35.951 [2024-07-25 13:24:16.648213] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25bb680 00:15:35.951 [2024-07-25 13:24:16.648319] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25b42e0 00:15:35.951 [2024-07-25 13:24:16.648324] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25b42e0 00:15:35.951 [2024-07-25 13:24:16.648407] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:35.951 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:35.951 13:24:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:35.951 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:35.951 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:35.952 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:35.952 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:35.952 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:35.952 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:35.952 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:35.952 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:35.952 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.952 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:36.211 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.211 "name": "raid_bdev1", 00:15:36.211 "uuid": "af18889c-c32d-4b92-ad1a-6eae610b5de7", 00:15:36.211 "strip_size_kb": 64, 00:15:36.211 "state": "online", 00:15:36.211 "raid_level": "concat", 00:15:36.211 "superblock": true, 00:15:36.211 "num_base_bdevs": 3, 00:15:36.211 "num_base_bdevs_discovered": 3, 00:15:36.211 "num_base_bdevs_operational": 3, 00:15:36.211 "base_bdevs_list": [ 00:15:36.211 { 00:15:36.211 "name": "pt1", 00:15:36.212 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:36.212 "is_configured": true, 00:15:36.212 "data_offset": 2048, 00:15:36.212 "data_size": 63488 00:15:36.212 }, 00:15:36.212 
{ 00:15:36.212 "name": "pt2", 00:15:36.212 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:36.212 "is_configured": true, 00:15:36.212 "data_offset": 2048, 00:15:36.212 "data_size": 63488 00:15:36.212 }, 00:15:36.212 { 00:15:36.212 "name": "pt3", 00:15:36.212 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:36.212 "is_configured": true, 00:15:36.212 "data_offset": 2048, 00:15:36.212 "data_size": 63488 00:15:36.212 } 00:15:36.212 ] 00:15:36.212 }' 00:15:36.212 13:24:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.212 13:24:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:36.779 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:15:36.779 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:36.779 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:36.779 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:36.779 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:36.779 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:36.779 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:36.779 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:36.779 [2024-07-25 13:24:17.569438] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:37.039 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:37.039 "name": "raid_bdev1", 00:15:37.039 "aliases": [ 00:15:37.039 "af18889c-c32d-4b92-ad1a-6eae610b5de7" 00:15:37.039 ], 00:15:37.039 "product_name": "Raid Volume", 
00:15:37.039 "block_size": 512, 00:15:37.039 "num_blocks": 190464, 00:15:37.039 "uuid": "af18889c-c32d-4b92-ad1a-6eae610b5de7", 00:15:37.039 "assigned_rate_limits": { 00:15:37.039 "rw_ios_per_sec": 0, 00:15:37.039 "rw_mbytes_per_sec": 0, 00:15:37.039 "r_mbytes_per_sec": 0, 00:15:37.039 "w_mbytes_per_sec": 0 00:15:37.039 }, 00:15:37.039 "claimed": false, 00:15:37.039 "zoned": false, 00:15:37.039 "supported_io_types": { 00:15:37.039 "read": true, 00:15:37.039 "write": true, 00:15:37.039 "unmap": true, 00:15:37.039 "flush": true, 00:15:37.039 "reset": true, 00:15:37.039 "nvme_admin": false, 00:15:37.039 "nvme_io": false, 00:15:37.039 "nvme_io_md": false, 00:15:37.039 "write_zeroes": true, 00:15:37.039 "zcopy": false, 00:15:37.039 "get_zone_info": false, 00:15:37.039 "zone_management": false, 00:15:37.039 "zone_append": false, 00:15:37.039 "compare": false, 00:15:37.039 "compare_and_write": false, 00:15:37.039 "abort": false, 00:15:37.039 "seek_hole": false, 00:15:37.039 "seek_data": false, 00:15:37.039 "copy": false, 00:15:37.039 "nvme_iov_md": false 00:15:37.039 }, 00:15:37.039 "memory_domains": [ 00:15:37.039 { 00:15:37.039 "dma_device_id": "system", 00:15:37.039 "dma_device_type": 1 00:15:37.039 }, 00:15:37.039 { 00:15:37.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.039 "dma_device_type": 2 00:15:37.039 }, 00:15:37.039 { 00:15:37.039 "dma_device_id": "system", 00:15:37.039 "dma_device_type": 1 00:15:37.039 }, 00:15:37.039 { 00:15:37.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.039 "dma_device_type": 2 00:15:37.039 }, 00:15:37.039 { 00:15:37.039 "dma_device_id": "system", 00:15:37.039 "dma_device_type": 1 00:15:37.039 }, 00:15:37.039 { 00:15:37.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.039 "dma_device_type": 2 00:15:37.039 } 00:15:37.039 ], 00:15:37.039 "driver_specific": { 00:15:37.039 "raid": { 00:15:37.039 "uuid": "af18889c-c32d-4b92-ad1a-6eae610b5de7", 00:15:37.039 "strip_size_kb": 64, 00:15:37.039 "state": "online", 00:15:37.039 
"raid_level": "concat", 00:15:37.039 "superblock": true, 00:15:37.039 "num_base_bdevs": 3, 00:15:37.039 "num_base_bdevs_discovered": 3, 00:15:37.039 "num_base_bdevs_operational": 3, 00:15:37.039 "base_bdevs_list": [ 00:15:37.039 { 00:15:37.039 "name": "pt1", 00:15:37.039 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:37.039 "is_configured": true, 00:15:37.039 "data_offset": 2048, 00:15:37.039 "data_size": 63488 00:15:37.039 }, 00:15:37.039 { 00:15:37.039 "name": "pt2", 00:15:37.039 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:37.039 "is_configured": true, 00:15:37.039 "data_offset": 2048, 00:15:37.039 "data_size": 63488 00:15:37.039 }, 00:15:37.039 { 00:15:37.039 "name": "pt3", 00:15:37.039 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:37.039 "is_configured": true, 00:15:37.039 "data_offset": 2048, 00:15:37.039 "data_size": 63488 00:15:37.039 } 00:15:37.039 ] 00:15:37.039 } 00:15:37.039 } 00:15:37.039 }' 00:15:37.039 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:37.039 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:37.039 pt2 00:15:37.039 pt3' 00:15:37.039 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:37.039 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:37.039 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:37.299 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:37.299 "name": "pt1", 00:15:37.299 "aliases": [ 00:15:37.299 "00000000-0000-0000-0000-000000000001" 00:15:37.299 ], 00:15:37.299 "product_name": "passthru", 00:15:37.299 "block_size": 512, 00:15:37.299 "num_blocks": 65536, 00:15:37.299 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:15:37.299 "assigned_rate_limits": { 00:15:37.299 "rw_ios_per_sec": 0, 00:15:37.299 "rw_mbytes_per_sec": 0, 00:15:37.299 "r_mbytes_per_sec": 0, 00:15:37.299 "w_mbytes_per_sec": 0 00:15:37.299 }, 00:15:37.299 "claimed": true, 00:15:37.299 "claim_type": "exclusive_write", 00:15:37.299 "zoned": false, 00:15:37.299 "supported_io_types": { 00:15:37.299 "read": true, 00:15:37.299 "write": true, 00:15:37.299 "unmap": true, 00:15:37.299 "flush": true, 00:15:37.299 "reset": true, 00:15:37.299 "nvme_admin": false, 00:15:37.299 "nvme_io": false, 00:15:37.299 "nvme_io_md": false, 00:15:37.299 "write_zeroes": true, 00:15:37.299 "zcopy": true, 00:15:37.299 "get_zone_info": false, 00:15:37.299 "zone_management": false, 00:15:37.299 "zone_append": false, 00:15:37.299 "compare": false, 00:15:37.299 "compare_and_write": false, 00:15:37.299 "abort": true, 00:15:37.299 "seek_hole": false, 00:15:37.299 "seek_data": false, 00:15:37.299 "copy": true, 00:15:37.299 "nvme_iov_md": false 00:15:37.299 }, 00:15:37.299 "memory_domains": [ 00:15:37.299 { 00:15:37.299 "dma_device_id": "system", 00:15:37.299 "dma_device_type": 1 00:15:37.299 }, 00:15:37.299 { 00:15:37.299 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.299 "dma_device_type": 2 00:15:37.299 } 00:15:37.299 ], 00:15:37.299 "driver_specific": { 00:15:37.299 "passthru": { 00:15:37.299 "name": "pt1", 00:15:37.299 "base_bdev_name": "malloc1" 00:15:37.299 } 00:15:37.299 } 00:15:37.299 }' 00:15:37.299 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.299 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.299 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:37.299 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.299 13:24:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.299 13:24:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:37.299 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.299 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.559 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:37.559 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.559 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:37.559 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:37.559 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:37.559 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:37.559 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:37.819 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:37.819 "name": "pt2", 00:15:37.819 "aliases": [ 00:15:37.819 "00000000-0000-0000-0000-000000000002" 00:15:37.819 ], 00:15:37.819 "product_name": "passthru", 00:15:37.819 "block_size": 512, 00:15:37.819 "num_blocks": 65536, 00:15:37.819 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:37.819 "assigned_rate_limits": { 00:15:37.819 "rw_ios_per_sec": 0, 00:15:37.819 "rw_mbytes_per_sec": 0, 00:15:37.819 "r_mbytes_per_sec": 0, 00:15:37.819 "w_mbytes_per_sec": 0 00:15:37.819 }, 00:15:37.819 "claimed": true, 00:15:37.819 "claim_type": "exclusive_write", 00:15:37.819 "zoned": false, 00:15:37.819 "supported_io_types": { 00:15:37.819 "read": true, 00:15:37.819 "write": true, 00:15:37.819 "unmap": true, 00:15:37.819 "flush": true, 00:15:37.819 "reset": true, 00:15:37.819 "nvme_admin": false, 00:15:37.819 
"nvme_io": false, 00:15:37.819 "nvme_io_md": false, 00:15:37.819 "write_zeroes": true, 00:15:37.819 "zcopy": true, 00:15:37.819 "get_zone_info": false, 00:15:37.819 "zone_management": false, 00:15:37.819 "zone_append": false, 00:15:37.819 "compare": false, 00:15:37.820 "compare_and_write": false, 00:15:37.820 "abort": true, 00:15:37.820 "seek_hole": false, 00:15:37.820 "seek_data": false, 00:15:37.820 "copy": true, 00:15:37.820 "nvme_iov_md": false 00:15:37.820 }, 00:15:37.820 "memory_domains": [ 00:15:37.820 { 00:15:37.820 "dma_device_id": "system", 00:15:37.820 "dma_device_type": 1 00:15:37.820 }, 00:15:37.820 { 00:15:37.820 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:37.820 "dma_device_type": 2 00:15:37.820 } 00:15:37.820 ], 00:15:37.820 "driver_specific": { 00:15:37.820 "passthru": { 00:15:37.820 "name": "pt2", 00:15:37.820 "base_bdev_name": "malloc2" 00:15:37.820 } 00:15:37.820 } 00:15:37.820 }' 00:15:37.820 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.820 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:37.820 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:37.820 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.820 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:37.820 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:37.820 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:37.820 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.080 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:38.080 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.080 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:15:38.080 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:38.080 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:38.080 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:38.080 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:38.340 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:38.340 "name": "pt3", 00:15:38.340 "aliases": [ 00:15:38.340 "00000000-0000-0000-0000-000000000003" 00:15:38.340 ], 00:15:38.340 "product_name": "passthru", 00:15:38.340 "block_size": 512, 00:15:38.340 "num_blocks": 65536, 00:15:38.340 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:38.340 "assigned_rate_limits": { 00:15:38.340 "rw_ios_per_sec": 0, 00:15:38.340 "rw_mbytes_per_sec": 0, 00:15:38.340 "r_mbytes_per_sec": 0, 00:15:38.340 "w_mbytes_per_sec": 0 00:15:38.340 }, 00:15:38.340 "claimed": true, 00:15:38.340 "claim_type": "exclusive_write", 00:15:38.340 "zoned": false, 00:15:38.340 "supported_io_types": { 00:15:38.340 "read": true, 00:15:38.340 "write": true, 00:15:38.340 "unmap": true, 00:15:38.340 "flush": true, 00:15:38.340 "reset": true, 00:15:38.340 "nvme_admin": false, 00:15:38.340 "nvme_io": false, 00:15:38.340 "nvme_io_md": false, 00:15:38.340 "write_zeroes": true, 00:15:38.340 "zcopy": true, 00:15:38.340 "get_zone_info": false, 00:15:38.340 "zone_management": false, 00:15:38.340 "zone_append": false, 00:15:38.340 "compare": false, 00:15:38.340 "compare_and_write": false, 00:15:38.340 "abort": true, 00:15:38.340 "seek_hole": false, 00:15:38.340 "seek_data": false, 00:15:38.340 "copy": true, 00:15:38.340 "nvme_iov_md": false 00:15:38.340 }, 00:15:38.340 "memory_domains": [ 00:15:38.340 { 00:15:38.340 "dma_device_id": "system", 00:15:38.340 
"dma_device_type": 1 00:15:38.340 }, 00:15:38.340 { 00:15:38.340 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:38.340 "dma_device_type": 2 00:15:38.340 } 00:15:38.340 ], 00:15:38.340 "driver_specific": { 00:15:38.340 "passthru": { 00:15:38.340 "name": "pt3", 00:15:38.340 "base_bdev_name": "malloc3" 00:15:38.340 } 00:15:38.340 } 00:15:38.340 }' 00:15:38.340 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.340 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:38.340 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:38.340 13:24:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.340 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:38.340 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:38.340 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.340 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:38.600 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:38.600 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.600 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:38.600 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:38.600 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:38.600 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:15:38.859 [2024-07-25 13:24:19.402068] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:38.859 13:24:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=af18889c-c32d-4b92-ad1a-6eae610b5de7 00:15:38.859 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z af18889c-c32d-4b92-ad1a-6eae610b5de7 ']' 00:15:38.859 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:38.859 [2024-07-25 13:24:19.602355] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:38.859 [2024-07-25 13:24:19.602370] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:38.859 [2024-07-25 13:24:19.602407] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:38.859 [2024-07-25 13:24:19.602447] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:38.859 [2024-07-25 13:24:19.602453] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25b42e0 name raid_bdev1, state offline 00:15:38.859 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.859 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:15:39.119 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:15:39.119 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:15:39.119 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:15:39.119 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:39.379 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i 
in "${base_bdevs_pt[@]}" 00:15:39.379 13:24:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:39.638 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:15:39.638 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:39.638 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:39.638 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:39.898 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:40.158 [2024-07-25 13:24:20.745209] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:40.158 [2024-07-25 13:24:20.746273] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:40.158 [2024-07-25 13:24:20.746307] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:40.158 [2024-07-25 13:24:20.746340] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:40.158 [2024-07-25 13:24:20.746367] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:40.158 [2024-07-25 13:24:20.746381] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:40.158 [2024-07-25 13:24:20.746391] bdev_raid.c:2398:raid_bdev_delete: 
*DEBUG*: delete raid bdev: raid_bdev1 00:15:40.158 [2024-07-25 13:24:20.746397] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25bbe50 name raid_bdev1, state configuring 00:15:40.158 request: 00:15:40.158 { 00:15:40.158 "name": "raid_bdev1", 00:15:40.158 "raid_level": "concat", 00:15:40.158 "base_bdevs": [ 00:15:40.158 "malloc1", 00:15:40.158 "malloc2", 00:15:40.158 "malloc3" 00:15:40.158 ], 00:15:40.158 "strip_size_kb": 64, 00:15:40.158 "superblock": false, 00:15:40.158 "method": "bdev_raid_create", 00:15:40.158 "req_id": 1 00:15:40.158 } 00:15:40.158 Got JSON-RPC error response 00:15:40.158 response: 00:15:40.158 { 00:15:40.158 "code": -17, 00:15:40.158 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:40.158 } 00:15:40.158 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:15:40.158 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:15:40.158 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:15:40.158 13:24:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:15:40.158 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.158 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:15:40.417 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:15:40.417 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:15:40.417 13:24:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:40.417 [2024-07-25 13:24:21.114088] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on malloc1 00:15:40.417 [2024-07-25 13:24:21.114107] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:40.417 [2024-07-25 13:24:21.114118] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25bbbe0 00:15:40.417 [2024-07-25 13:24:21.114124] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:40.417 [2024-07-25 13:24:21.115346] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:40.417 [2024-07-25 13:24:21.115365] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:40.417 [2024-07-25 13:24:21.115407] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:40.417 [2024-07-25 13:24:21.115425] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:40.417 pt1 00:15:40.417 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:40.417 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:40.417 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:40.417 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:40.417 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:40.417 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:40.417 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:40.417 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:40.418 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:40.418 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:15:40.418 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.418 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:40.677 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:40.677 "name": "raid_bdev1", 00:15:40.677 "uuid": "af18889c-c32d-4b92-ad1a-6eae610b5de7", 00:15:40.677 "strip_size_kb": 64, 00:15:40.677 "state": "configuring", 00:15:40.677 "raid_level": "concat", 00:15:40.677 "superblock": true, 00:15:40.677 "num_base_bdevs": 3, 00:15:40.677 "num_base_bdevs_discovered": 1, 00:15:40.677 "num_base_bdevs_operational": 3, 00:15:40.677 "base_bdevs_list": [ 00:15:40.677 { 00:15:40.677 "name": "pt1", 00:15:40.677 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:40.677 "is_configured": true, 00:15:40.677 "data_offset": 2048, 00:15:40.677 "data_size": 63488 00:15:40.677 }, 00:15:40.677 { 00:15:40.677 "name": null, 00:15:40.677 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:40.677 "is_configured": false, 00:15:40.677 "data_offset": 2048, 00:15:40.677 "data_size": 63488 00:15:40.677 }, 00:15:40.677 { 00:15:40.677 "name": null, 00:15:40.677 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:40.677 "is_configured": false, 00:15:40.677 "data_offset": 2048, 00:15:40.677 "data_size": 63488 00:15:40.677 } 00:15:40.677 ] 00:15:40.677 }' 00:15:40.677 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:40.677 13:24:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:41.247 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:15:41.247 13:24:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:41.247 [2024-07-25 13:24:22.016372] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:41.247 [2024-07-25 13:24:22.016401] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:41.247 [2024-07-25 13:24:22.016410] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x275f410 00:15:41.247 [2024-07-25 13:24:22.016416] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:41.247 [2024-07-25 13:24:22.016689] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:41.247 [2024-07-25 13:24:22.016703] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:41.247 [2024-07-25 13:24:22.016745] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:41.247 [2024-07-25 13:24:22.016758] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:41.247 pt2 00:15:41.247 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:41.506 [2024-07-25 13:24:22.192829] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:41.506 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:15:41.506 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:41.506 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:41.506 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:41.506 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:41.506 13:24:22 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:41.506 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:41.507 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:41.507 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:41.507 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:41.507 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.507 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:41.766 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:41.766 "name": "raid_bdev1", 00:15:41.766 "uuid": "af18889c-c32d-4b92-ad1a-6eae610b5de7", 00:15:41.766 "strip_size_kb": 64, 00:15:41.766 "state": "configuring", 00:15:41.766 "raid_level": "concat", 00:15:41.766 "superblock": true, 00:15:41.766 "num_base_bdevs": 3, 00:15:41.766 "num_base_bdevs_discovered": 1, 00:15:41.766 "num_base_bdevs_operational": 3, 00:15:41.766 "base_bdevs_list": [ 00:15:41.766 { 00:15:41.766 "name": "pt1", 00:15:41.766 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:41.766 "is_configured": true, 00:15:41.766 "data_offset": 2048, 00:15:41.766 "data_size": 63488 00:15:41.766 }, 00:15:41.766 { 00:15:41.766 "name": null, 00:15:41.766 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:41.766 "is_configured": false, 00:15:41.766 "data_offset": 2048, 00:15:41.766 "data_size": 63488 00:15:41.766 }, 00:15:41.766 { 00:15:41.766 "name": null, 00:15:41.766 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:41.766 "is_configured": false, 00:15:41.766 "data_offset": 2048, 00:15:41.766 "data_size": 63488 00:15:41.766 } 00:15:41.766 ] 00:15:41.766 }' 
00:15:41.766 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:41.766 13:24:22 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:42.336 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:15:42.336 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:15:42.336 13:24:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:42.336 [2024-07-25 13:24:23.127191] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:42.336 [2024-07-25 13:24:23.127222] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:42.336 [2024-07-25 13:24:23.127232] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25bbe50 00:15:42.336 [2024-07-25 13:24:23.127238] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:42.336 [2024-07-25 13:24:23.127501] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:42.336 [2024-07-25 13:24:23.127513] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:42.336 [2024-07-25 13:24:23.127565] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:42.336 [2024-07-25 13:24:23.127577] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:42.596 pt2 00:15:42.596 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:15:42.596 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:15:42.596 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:42.596 [2024-07-25 13:24:23.315666] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:42.596 [2024-07-25 13:24:23.315689] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:42.596 [2024-07-25 13:24:23.315697] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25b4e30 00:15:42.596 [2024-07-25 13:24:23.315703] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:42.596 [2024-07-25 13:24:23.315934] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:42.596 [2024-07-25 13:24:23.315944] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:42.596 [2024-07-25 13:24:23.315979] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:42.596 [2024-07-25 13:24:23.315990] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:42.596 [2024-07-25 13:24:23.316070] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x25b24a0 00:15:42.596 [2024-07-25 13:24:23.316076] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:42.596 [2024-07-25 13:24:23.316205] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2755070 00:15:42.596 [2024-07-25 13:24:23.316302] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25b24a0 00:15:42.596 [2024-07-25 13:24:23.316307] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25b24a0 00:15:42.596 [2024-07-25 13:24:23.316377] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:42.596 pt3 00:15:42.596 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:15:42.596 13:24:23 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:15:42.596 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:42.596 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:42.596 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:42.596 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:42.597 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:42.597 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:42.597 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:42.597 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:42.597 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:42.597 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:42.597 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.597 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:42.885 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:42.885 "name": "raid_bdev1", 00:15:42.885 "uuid": "af18889c-c32d-4b92-ad1a-6eae610b5de7", 00:15:42.885 "strip_size_kb": 64, 00:15:42.885 "state": "online", 00:15:42.885 "raid_level": "concat", 00:15:42.885 "superblock": true, 00:15:42.885 "num_base_bdevs": 3, 00:15:42.885 "num_base_bdevs_discovered": 3, 00:15:42.885 "num_base_bdevs_operational": 3, 00:15:42.885 "base_bdevs_list": [ 00:15:42.885 { 
00:15:42.885 "name": "pt1", 00:15:42.885 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:42.885 "is_configured": true, 00:15:42.885 "data_offset": 2048, 00:15:42.885 "data_size": 63488 00:15:42.885 }, 00:15:42.885 { 00:15:42.885 "name": "pt2", 00:15:42.885 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:42.885 "is_configured": true, 00:15:42.885 "data_offset": 2048, 00:15:42.885 "data_size": 63488 00:15:42.885 }, 00:15:42.885 { 00:15:42.885 "name": "pt3", 00:15:42.885 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:42.885 "is_configured": true, 00:15:42.885 "data_offset": 2048, 00:15:42.885 "data_size": 63488 00:15:42.885 } 00:15:42.885 ] 00:15:42.885 }' 00:15:42.885 13:24:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:42.885 13:24:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:43.453 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:15:43.453 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:43.453 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:43.453 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:43.453 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:43.453 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:43.453 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:43.453 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:43.712 [2024-07-25 13:24:24.246236] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:43.712 13:24:24 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:43.712 "name": "raid_bdev1", 00:15:43.712 "aliases": [ 00:15:43.712 "af18889c-c32d-4b92-ad1a-6eae610b5de7" 00:15:43.712 ], 00:15:43.712 "product_name": "Raid Volume", 00:15:43.712 "block_size": 512, 00:15:43.712 "num_blocks": 190464, 00:15:43.713 "uuid": "af18889c-c32d-4b92-ad1a-6eae610b5de7", 00:15:43.713 "assigned_rate_limits": { 00:15:43.713 "rw_ios_per_sec": 0, 00:15:43.713 "rw_mbytes_per_sec": 0, 00:15:43.713 "r_mbytes_per_sec": 0, 00:15:43.713 "w_mbytes_per_sec": 0 00:15:43.713 }, 00:15:43.713 "claimed": false, 00:15:43.713 "zoned": false, 00:15:43.713 "supported_io_types": { 00:15:43.713 "read": true, 00:15:43.713 "write": true, 00:15:43.713 "unmap": true, 00:15:43.713 "flush": true, 00:15:43.713 "reset": true, 00:15:43.713 "nvme_admin": false, 00:15:43.713 "nvme_io": false, 00:15:43.713 "nvme_io_md": false, 00:15:43.713 "write_zeroes": true, 00:15:43.713 "zcopy": false, 00:15:43.713 "get_zone_info": false, 00:15:43.713 "zone_management": false, 00:15:43.713 "zone_append": false, 00:15:43.713 "compare": false, 00:15:43.713 "compare_and_write": false, 00:15:43.713 "abort": false, 00:15:43.713 "seek_hole": false, 00:15:43.713 "seek_data": false, 00:15:43.713 "copy": false, 00:15:43.713 "nvme_iov_md": false 00:15:43.713 }, 00:15:43.713 "memory_domains": [ 00:15:43.713 { 00:15:43.713 "dma_device_id": "system", 00:15:43.713 "dma_device_type": 1 00:15:43.713 }, 00:15:43.713 { 00:15:43.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.713 "dma_device_type": 2 00:15:43.713 }, 00:15:43.713 { 00:15:43.713 "dma_device_id": "system", 00:15:43.713 "dma_device_type": 1 00:15:43.713 }, 00:15:43.713 { 00:15:43.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.713 "dma_device_type": 2 00:15:43.713 }, 00:15:43.713 { 00:15:43.713 "dma_device_id": "system", 00:15:43.713 "dma_device_type": 1 00:15:43.713 }, 00:15:43.713 { 00:15:43.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.713 "dma_device_type": 2 
00:15:43.713 } 00:15:43.713 ], 00:15:43.713 "driver_specific": { 00:15:43.713 "raid": { 00:15:43.713 "uuid": "af18889c-c32d-4b92-ad1a-6eae610b5de7", 00:15:43.713 "strip_size_kb": 64, 00:15:43.713 "state": "online", 00:15:43.713 "raid_level": "concat", 00:15:43.713 "superblock": true, 00:15:43.713 "num_base_bdevs": 3, 00:15:43.713 "num_base_bdevs_discovered": 3, 00:15:43.713 "num_base_bdevs_operational": 3, 00:15:43.713 "base_bdevs_list": [ 00:15:43.713 { 00:15:43.713 "name": "pt1", 00:15:43.713 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:43.713 "is_configured": true, 00:15:43.713 "data_offset": 2048, 00:15:43.713 "data_size": 63488 00:15:43.713 }, 00:15:43.713 { 00:15:43.713 "name": "pt2", 00:15:43.713 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:43.713 "is_configured": true, 00:15:43.713 "data_offset": 2048, 00:15:43.713 "data_size": 63488 00:15:43.713 }, 00:15:43.713 { 00:15:43.713 "name": "pt3", 00:15:43.713 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:43.713 "is_configured": true, 00:15:43.713 "data_offset": 2048, 00:15:43.713 "data_size": 63488 00:15:43.713 } 00:15:43.713 ] 00:15:43.713 } 00:15:43.713 } 00:15:43.713 }' 00:15:43.713 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:43.713 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:43.713 pt2 00:15:43.713 pt3' 00:15:43.713 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:43.713 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:43.713 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:43.713 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:43.713 "name": 
"pt1", 00:15:43.713 "aliases": [ 00:15:43.713 "00000000-0000-0000-0000-000000000001" 00:15:43.713 ], 00:15:43.713 "product_name": "passthru", 00:15:43.713 "block_size": 512, 00:15:43.713 "num_blocks": 65536, 00:15:43.713 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:43.713 "assigned_rate_limits": { 00:15:43.713 "rw_ios_per_sec": 0, 00:15:43.713 "rw_mbytes_per_sec": 0, 00:15:43.713 "r_mbytes_per_sec": 0, 00:15:43.713 "w_mbytes_per_sec": 0 00:15:43.713 }, 00:15:43.713 "claimed": true, 00:15:43.713 "claim_type": "exclusive_write", 00:15:43.713 "zoned": false, 00:15:43.713 "supported_io_types": { 00:15:43.713 "read": true, 00:15:43.713 "write": true, 00:15:43.713 "unmap": true, 00:15:43.713 "flush": true, 00:15:43.713 "reset": true, 00:15:43.713 "nvme_admin": false, 00:15:43.713 "nvme_io": false, 00:15:43.713 "nvme_io_md": false, 00:15:43.713 "write_zeroes": true, 00:15:43.713 "zcopy": true, 00:15:43.713 "get_zone_info": false, 00:15:43.713 "zone_management": false, 00:15:43.713 "zone_append": false, 00:15:43.713 "compare": false, 00:15:43.713 "compare_and_write": false, 00:15:43.713 "abort": true, 00:15:43.713 "seek_hole": false, 00:15:43.713 "seek_data": false, 00:15:43.713 "copy": true, 00:15:43.713 "nvme_iov_md": false 00:15:43.713 }, 00:15:43.713 "memory_domains": [ 00:15:43.713 { 00:15:43.713 "dma_device_id": "system", 00:15:43.713 "dma_device_type": 1 00:15:43.713 }, 00:15:43.713 { 00:15:43.713 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.713 "dma_device_type": 2 00:15:43.713 } 00:15:43.713 ], 00:15:43.713 "driver_specific": { 00:15:43.713 "passthru": { 00:15:43.713 "name": "pt1", 00:15:43.713 "base_bdev_name": "malloc1" 00:15:43.713 } 00:15:43.713 } 00:15:43.713 }' 00:15:43.972 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.972 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.972 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:15:43.972 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.973 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.973 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.973 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.973 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.232 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:44.232 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.232 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.232 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:44.232 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:44.232 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:44.232 13:24:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:44.492 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:44.492 "name": "pt2", 00:15:44.492 "aliases": [ 00:15:44.492 "00000000-0000-0000-0000-000000000002" 00:15:44.492 ], 00:15:44.492 "product_name": "passthru", 00:15:44.492 "block_size": 512, 00:15:44.492 "num_blocks": 65536, 00:15:44.492 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:44.492 "assigned_rate_limits": { 00:15:44.492 "rw_ios_per_sec": 0, 00:15:44.492 "rw_mbytes_per_sec": 0, 00:15:44.492 "r_mbytes_per_sec": 0, 00:15:44.492 "w_mbytes_per_sec": 0 00:15:44.492 }, 00:15:44.492 "claimed": true, 00:15:44.492 "claim_type": "exclusive_write", 00:15:44.492 "zoned": false, 
00:15:44.492 "supported_io_types": { 00:15:44.492 "read": true, 00:15:44.492 "write": true, 00:15:44.492 "unmap": true, 00:15:44.492 "flush": true, 00:15:44.492 "reset": true, 00:15:44.492 "nvme_admin": false, 00:15:44.492 "nvme_io": false, 00:15:44.492 "nvme_io_md": false, 00:15:44.492 "write_zeroes": true, 00:15:44.492 "zcopy": true, 00:15:44.492 "get_zone_info": false, 00:15:44.492 "zone_management": false, 00:15:44.492 "zone_append": false, 00:15:44.492 "compare": false, 00:15:44.492 "compare_and_write": false, 00:15:44.492 "abort": true, 00:15:44.492 "seek_hole": false, 00:15:44.492 "seek_data": false, 00:15:44.492 "copy": true, 00:15:44.492 "nvme_iov_md": false 00:15:44.492 }, 00:15:44.492 "memory_domains": [ 00:15:44.492 { 00:15:44.492 "dma_device_id": "system", 00:15:44.492 "dma_device_type": 1 00:15:44.492 }, 00:15:44.492 { 00:15:44.492 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.492 "dma_device_type": 2 00:15:44.492 } 00:15:44.492 ], 00:15:44.492 "driver_specific": { 00:15:44.492 "passthru": { 00:15:44.492 "name": "pt2", 00:15:44.492 "base_bdev_name": "malloc2" 00:15:44.492 } 00:15:44.492 } 00:15:44.492 }' 00:15:44.492 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.492 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.492 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:44.492 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.492 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.492 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:44.492 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.751 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.751 13:24:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:44.752 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.752 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.752 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:44.752 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:44.752 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:44.752 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:45.012 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:45.012 "name": "pt3", 00:15:45.012 "aliases": [ 00:15:45.012 "00000000-0000-0000-0000-000000000003" 00:15:45.012 ], 00:15:45.012 "product_name": "passthru", 00:15:45.012 "block_size": 512, 00:15:45.012 "num_blocks": 65536, 00:15:45.012 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:45.012 "assigned_rate_limits": { 00:15:45.012 "rw_ios_per_sec": 0, 00:15:45.012 "rw_mbytes_per_sec": 0, 00:15:45.012 "r_mbytes_per_sec": 0, 00:15:45.012 "w_mbytes_per_sec": 0 00:15:45.012 }, 00:15:45.012 "claimed": true, 00:15:45.012 "claim_type": "exclusive_write", 00:15:45.012 "zoned": false, 00:15:45.012 "supported_io_types": { 00:15:45.012 "read": true, 00:15:45.012 "write": true, 00:15:45.012 "unmap": true, 00:15:45.012 "flush": true, 00:15:45.012 "reset": true, 00:15:45.012 "nvme_admin": false, 00:15:45.012 "nvme_io": false, 00:15:45.012 "nvme_io_md": false, 00:15:45.012 "write_zeroes": true, 00:15:45.012 "zcopy": true, 00:15:45.012 "get_zone_info": false, 00:15:45.012 "zone_management": false, 00:15:45.012 "zone_append": false, 00:15:45.012 "compare": false, 00:15:45.012 "compare_and_write": false, 00:15:45.012 "abort": true, 00:15:45.012 
"seek_hole": false, 00:15:45.012 "seek_data": false, 00:15:45.012 "copy": true, 00:15:45.012 "nvme_iov_md": false 00:15:45.012 }, 00:15:45.012 "memory_domains": [ 00:15:45.012 { 00:15:45.012 "dma_device_id": "system", 00:15:45.012 "dma_device_type": 1 00:15:45.012 }, 00:15:45.012 { 00:15:45.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:45.012 "dma_device_type": 2 00:15:45.012 } 00:15:45.012 ], 00:15:45.012 "driver_specific": { 00:15:45.012 "passthru": { 00:15:45.012 "name": "pt3", 00:15:45.012 "base_bdev_name": "malloc3" 00:15:45.012 } 00:15:45.012 } 00:15:45.012 }' 00:15:45.012 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:45.012 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:45.012 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:45.012 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:45.012 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:45.012 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:45.012 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:45.272 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:45.272 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:45.272 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:45.272 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:45.272 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:45.272 13:24:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:45.272 13:24:25 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:15:45.567 [2024-07-25 13:24:26.106990] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' af18889c-c32d-4b92-ad1a-6eae610b5de7 '!=' af18889c-c32d-4b92-ad1a-6eae610b5de7 ']' 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 920895 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 920895 ']' 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 920895 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 920895 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 920895' 00:15:45.567 killing process with pid 920895 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 920895 00:15:45.567 [2024-07-25 13:24:26.176117] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:45.567 [2024-07-25 13:24:26.176159] bdev_raid.c: 
487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:45.567 [2024-07-25 13:24:26.176198] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:45.567 [2024-07-25 13:24:26.176204] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25b24a0 name raid_bdev1, state offline 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 920895 00:15:45.567 [2024-07-25 13:24:26.191105] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:45.567 13:24:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:15:45.567 00:15:45.567 real 0m11.881s 00:15:45.568 user 0m21.805s 00:15:45.568 sys 0m1.812s 00:15:45.568 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:45.568 13:24:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.568 ************************************ 00:15:45.568 END TEST raid_superblock_test 00:15:45.568 ************************************ 00:15:45.851 13:24:26 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:15:45.851 13:24:26 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:45.851 13:24:26 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:45.851 13:24:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:45.851 ************************************ 00:15:45.851 START TEST raid_read_error_test 00:15:45.851 ************************************ 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 3 read 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:15:45.851 13:24:26 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.3LD9W9xUFN 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=923103 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 923103 /var/tmp/spdk-raid.sock 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 923103 ']' 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:45.851 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:45.851 13:24:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.851 [2024-07-25 13:24:26.459762] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:15:45.851 [2024-07-25 13:24:26.459816] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid923103 ] 00:15:45.851 [2024-07-25 13:24:26.551900] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:45.851 [2024-07-25 13:24:26.619837] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:46.110 [2024-07-25 13:24:26.665812] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:46.110 [2024-07-25 13:24:26.665836] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:46.681 13:24:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:46.681 13:24:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:46.681 13:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:46.681 13:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:46.940 BaseBdev1_malloc 00:15:46.940 13:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:46.940 true 00:15:46.941 13:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:47.201 [2024-07-25 13:24:27.857034] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:47.201 [2024-07-25 13:24:27.857066] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:47.201 [2024-07-25 13:24:27.857079] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x158e2a0 00:15:47.201 [2024-07-25 13:24:27.857085] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.201 [2024-07-25 13:24:27.858380] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.201 [2024-07-25 13:24:27.858401] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:47.201 BaseBdev1 00:15:47.201 13:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:47.201 13:24:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:47.462 BaseBdev2_malloc 00:15:47.462 13:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:47.462 true 00:15:47.462 13:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:47.722 [2024-07-25 13:24:28.396384] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:47.722 [2024-07-25 13:24:28.396412] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:47.722 [2024-07-25 13:24:28.396424] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x164d420 00:15:47.722 [2024-07-25 13:24:28.396430] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.722 [2024-07-25 13:24:28.397609] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.722 [2024-07-25 13:24:28.397628] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:47.722 BaseBdev2 00:15:47.722 13:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:47.722 13:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:47.982 BaseBdev3_malloc 00:15:47.982 13:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:48.242 true 00:15:48.242 13:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:48.242 [2024-07-25 13:24:28.971757] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:48.242 [2024-07-25 13:24:28.971787] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:48.242 [2024-07-25 13:24:28.971801] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x164ef70 00:15:48.242 [2024-07-25 13:24:28.971808] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:48.242 [2024-07-25 13:24:28.973002] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:48.242 [2024-07-25 13:24:28.973021] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:48.242 BaseBdev3 00:15:48.242 13:24:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:48.502 [2024-07-25 13:24:29.160344] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:48.502 [2024-07-25 13:24:29.161357] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:48.502 [2024-07-25 13:24:29.161411] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:48.502 [2024-07-25 13:24:29.161559] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1650cc0 00:15:48.502 [2024-07-25 13:24:29.161567] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:48.502 [2024-07-25 13:24:29.161717] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1651f70 00:15:48.502 [2024-07-25 13:24:29.161829] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1650cc0 00:15:48.502 [2024-07-25 13:24:29.161834] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1650cc0 00:15:48.503 [2024-07-25 13:24:29.161920] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.503 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:48.762 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.762 "name": "raid_bdev1", 00:15:48.762 "uuid": "9546aa38-ad8d-48de-a0c5-b757918cd549", 00:15:48.762 "strip_size_kb": 64, 00:15:48.762 "state": "online", 00:15:48.762 "raid_level": "concat", 00:15:48.762 "superblock": true, 00:15:48.762 "num_base_bdevs": 3, 00:15:48.762 "num_base_bdevs_discovered": 3, 00:15:48.762 "num_base_bdevs_operational": 3, 00:15:48.762 "base_bdevs_list": [ 00:15:48.762 { 00:15:48.762 "name": "BaseBdev1", 00:15:48.762 "uuid": "a02f069a-228d-53f7-bff0-eae2cbbdd1e5", 00:15:48.762 "is_configured": true, 00:15:48.762 "data_offset": 2048, 00:15:48.762 "data_size": 63488 00:15:48.762 }, 00:15:48.762 { 00:15:48.762 "name": "BaseBdev2", 00:15:48.762 "uuid": "f2de7f5b-ccd3-59bc-a20c-ddf6447bd577", 00:15:48.762 "is_configured": true, 00:15:48.762 "data_offset": 2048, 00:15:48.762 "data_size": 63488 00:15:48.762 }, 00:15:48.762 { 00:15:48.762 "name": "BaseBdev3", 00:15:48.762 "uuid": "69f157a6-f2f9-5301-8fda-277ac6daff3b", 00:15:48.762 "is_configured": true, 00:15:48.762 "data_offset": 2048, 00:15:48.762 "data_size": 63488 00:15:48.762 } 00:15:48.762 ] 00:15:48.762 }' 00:15:48.762 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.762 13:24:29 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.333 13:24:29 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@840 -- # sleep 1 00:15:49.333 13:24:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:49.333 [2024-07-25 13:24:29.990675] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1657000 00:15:50.271 13:24:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:50.531 "name": "raid_bdev1", 00:15:50.531 "uuid": "9546aa38-ad8d-48de-a0c5-b757918cd549", 00:15:50.531 "strip_size_kb": 64, 00:15:50.531 "state": "online", 00:15:50.531 "raid_level": "concat", 00:15:50.531 "superblock": true, 00:15:50.531 "num_base_bdevs": 3, 00:15:50.531 "num_base_bdevs_discovered": 3, 00:15:50.531 "num_base_bdevs_operational": 3, 00:15:50.531 "base_bdevs_list": [ 00:15:50.531 { 00:15:50.531 "name": "BaseBdev1", 00:15:50.531 "uuid": "a02f069a-228d-53f7-bff0-eae2cbbdd1e5", 00:15:50.531 "is_configured": true, 00:15:50.531 "data_offset": 2048, 00:15:50.531 "data_size": 63488 00:15:50.531 }, 00:15:50.531 { 00:15:50.531 "name": "BaseBdev2", 00:15:50.531 "uuid": "f2de7f5b-ccd3-59bc-a20c-ddf6447bd577", 00:15:50.531 "is_configured": true, 00:15:50.531 "data_offset": 2048, 00:15:50.531 "data_size": 63488 00:15:50.531 }, 00:15:50.531 { 00:15:50.531 "name": "BaseBdev3", 00:15:50.531 "uuid": "69f157a6-f2f9-5301-8fda-277ac6daff3b", 00:15:50.531 "is_configured": true, 00:15:50.531 "data_offset": 2048, 00:15:50.531 "data_size": 63488 00:15:50.531 } 00:15:50.531 ] 00:15:50.531 }' 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:50.531 13:24:31 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.100 13:24:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:51.360 [2024-07-25 
13:24:32.035830] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:51.360 [2024-07-25 13:24:32.035861] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:51.360 [2024-07-25 13:24:32.038445] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:51.360 [2024-07-25 13:24:32.038474] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:51.360 [2024-07-25 13:24:32.038498] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:51.360 [2024-07-25 13:24:32.038505] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1650cc0 name raid_bdev1, state offline 00:15:51.360 0 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 923103 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 923103 ']' 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 923103 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 923103 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 923103' 00:15:51.360 killing process with pid 923103 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 923103 00:15:51.360 [2024-07-25 13:24:32.121061] 
bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:51.360 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 923103 00:15:51.360 [2024-07-25 13:24:32.132368] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:51.621 13:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:15:51.621 13:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.3LD9W9xUFN 00:15:51.621 13:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:15:51.621 13:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.49 00:15:51.621 13:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:15:51.621 13:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:51.621 13:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:51.621 13:24:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.49 != \0\.\0\0 ]] 00:15:51.621 00:15:51.621 real 0m5.875s 00:15:51.621 user 0m9.338s 00:15:51.621 sys 0m0.834s 00:15:51.621 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:51.621 13:24:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.621 ************************************ 00:15:51.621 END TEST raid_read_error_test 00:15:51.621 ************************************ 00:15:51.621 13:24:32 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:15:51.621 13:24:32 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:51.621 13:24:32 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:51.621 13:24:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:51.621 ************************************ 00:15:51.621 START TEST raid_write_error_test 
00:15:51.621 ************************************ 00:15:51.621 13:24:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 3 write 00:15:51.621 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:15:51.621 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:15:51.622 13:24:32 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.4bZM5QY84a 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=924185 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 924185 /var/tmp/spdk-raid.sock 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 924185 ']' 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:15:51.622 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:51.622 13:24:32 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:51.622 [2024-07-25 13:24:32.408877] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:15:51.622 [2024-07-25 13:24:32.408935] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid924185 ] 00:15:51.882 [2024-07-25 13:24:32.502795] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:51.882 [2024-07-25 13:24:32.578430] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:51.882 [2024-07-25 13:24:32.626740] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:51.882 [2024-07-25 13:24:32.626764] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:52.820 13:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:52.820 13:24:33 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:15:52.820 13:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:52.820 13:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:15:52.820 BaseBdev1_malloc 00:15:52.820 13:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:15:53.079 true 00:15:53.079 13:24:33 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:15:53.079 [2024-07-25 13:24:33.822361] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:15:53.079 [2024-07-25 13:24:33.822395] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:53.079 [2024-07-25 13:24:33.822407] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ec02a0 00:15:53.079 [2024-07-25 13:24:33.822413] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:53.079 [2024-07-25 13:24:33.823731] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:53.079 [2024-07-25 13:24:33.823751] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:15:53.079 BaseBdev1 00:15:53.079 13:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:53.079 13:24:33 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:15:53.339 BaseBdev2_malloc 00:15:53.339 13:24:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:15:53.601 true 00:15:53.601 13:24:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:15:53.862 [2024-07-25 13:24:34.397465] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:15:53.862 [2024-07-25 13:24:34.397497] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:53.862 [2024-07-25 13:24:34.397509] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f7f420 00:15:53.862 [2024-07-25 13:24:34.397515] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:53.862 [2024-07-25 13:24:34.398730] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:53.862 [2024-07-25 13:24:34.398749] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:15:53.862 BaseBdev2 00:15:53.862 13:24:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:15:53.862 13:24:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:15:53.862 BaseBdev3_malloc 00:15:53.862 13:24:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:15:54.121 true 00:15:54.121 13:24:34 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:15:54.381 [2024-07-25 13:24:34.980537] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:15:54.381 [2024-07-25 13:24:34.980571] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:54.381 [2024-07-25 13:24:34.980583] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1f80f70 00:15:54.381 [2024-07-25 13:24:34.980590] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:54.381 [2024-07-25 13:24:34.981794] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:15:54.381 [2024-07-25 13:24:34.981817] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:15:54.381 BaseBdev3 00:15:54.381 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:15:54.381 [2024-07-25 13:24:35.169039] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:54.381 [2024-07-25 13:24:35.170025] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:54.381 [2024-07-25 13:24:35.170079] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:54.381 [2024-07-25 13:24:35.170223] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f82cc0 00:15:54.381 [2024-07-25 13:24:35.170230] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:15:54.381 [2024-07-25 13:24:35.170377] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f83f70 00:15:54.381 [2024-07-25 13:24:35.170487] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f82cc0 00:15:54.382 [2024-07-25 13:24:35.170492] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f82cc0 00:15:54.382 [2024-07-25 13:24:35.170583] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 
-- # local raid_level=concat 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:54.641 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:54.641 "name": "raid_bdev1", 00:15:54.641 "uuid": "84ab6897-87e2-48a0-8998-29092e1e9674", 00:15:54.641 "strip_size_kb": 64, 00:15:54.641 "state": "online", 00:15:54.641 "raid_level": "concat", 00:15:54.641 "superblock": true, 00:15:54.641 "num_base_bdevs": 3, 00:15:54.641 "num_base_bdevs_discovered": 3, 00:15:54.641 "num_base_bdevs_operational": 3, 00:15:54.641 "base_bdevs_list": [ 00:15:54.641 { 00:15:54.641 "name": "BaseBdev1", 00:15:54.641 "uuid": "8a1825ae-8616-5f14-806e-30edbfa8d1cb", 00:15:54.641 "is_configured": true, 00:15:54.641 "data_offset": 2048, 00:15:54.641 "data_size": 63488 00:15:54.641 }, 00:15:54.641 { 00:15:54.641 "name": "BaseBdev2", 00:15:54.641 "uuid": "9bd0f850-6d81-5ef6-8197-37baacef1602", 00:15:54.641 "is_configured": true, 00:15:54.641 "data_offset": 2048, 00:15:54.642 "data_size": 63488 00:15:54.642 }, 00:15:54.642 { 00:15:54.642 "name": "BaseBdev3", 
00:15:54.642 "uuid": "5d9afd59-0065-5522-bd5d-dada1034d2b0", 00:15:54.642 "is_configured": true, 00:15:54.642 "data_offset": 2048, 00:15:54.642 "data_size": 63488 00:15:54.642 } 00:15:54.642 ] 00:15:54.642 }' 00:15:54.642 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:54.642 13:24:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:55.209 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:15:55.209 13:24:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:15:55.210 [2024-07-25 13:24:35.991347] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f89000 00:15:56.147 13:24:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:15:56.407 13:24:37 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.407 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:56.667 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:56.667 "name": "raid_bdev1", 00:15:56.667 "uuid": "84ab6897-87e2-48a0-8998-29092e1e9674", 00:15:56.667 "strip_size_kb": 64, 00:15:56.667 "state": "online", 00:15:56.667 "raid_level": "concat", 00:15:56.667 "superblock": true, 00:15:56.667 "num_base_bdevs": 3, 00:15:56.667 "num_base_bdevs_discovered": 3, 00:15:56.667 "num_base_bdevs_operational": 3, 00:15:56.667 "base_bdevs_list": [ 00:15:56.667 { 00:15:56.667 "name": "BaseBdev1", 00:15:56.667 "uuid": "8a1825ae-8616-5f14-806e-30edbfa8d1cb", 00:15:56.667 "is_configured": true, 00:15:56.667 "data_offset": 2048, 00:15:56.667 "data_size": 63488 00:15:56.667 }, 00:15:56.667 { 00:15:56.667 "name": "BaseBdev2", 00:15:56.667 "uuid": "9bd0f850-6d81-5ef6-8197-37baacef1602", 00:15:56.667 "is_configured": true, 00:15:56.667 "data_offset": 2048, 00:15:56.667 "data_size": 63488 00:15:56.667 }, 00:15:56.667 { 00:15:56.668 "name": "BaseBdev3", 00:15:56.668 "uuid": "5d9afd59-0065-5522-bd5d-dada1034d2b0", 00:15:56.668 "is_configured": true, 00:15:56.668 "data_offset": 2048, 00:15:56.668 "data_size": 
63488 00:15:56.668 } 00:15:56.668 ] 00:15:56.668 }' 00:15:56.668 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:56.668 13:24:37 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.238 13:24:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:57.498 [2024-07-25 13:24:38.059275] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:57.498 [2024-07-25 13:24:38.059304] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:57.498 [2024-07-25 13:24:38.061894] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:57.498 [2024-07-25 13:24:38.061921] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:57.498 [2024-07-25 13:24:38.061944] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:57.498 [2024-07-25 13:24:38.061950] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f82cc0 name raid_bdev1, state offline 00:15:57.498 0 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 924185 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 924185 ']' 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 924185 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 924185 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 924185' 00:15:57.498 killing process with pid 924185 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 924185 00:15:57.498 [2024-07-25 13:24:38.126665] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 924185 00:15:57.498 [2024-07-25 13:24:38.137964] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.4bZM5QY84a 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.49 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.49 != \0\.\0\0 ]] 00:15:57.498 00:15:57.498 real 0m5.934s 00:15:57.498 user 0m9.459s 00:15:57.498 sys 0m0.837s 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:15:57.498 13:24:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.498 ************************************ 00:15:57.498 END TEST raid_write_error_test 00:15:57.498 ************************************ 
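The `raid_write_error_test` run above finishes by extracting a failures-per-second figure from the bdevperf log (`grep -v Job /raidtest/tmp.4bZM5QY84a | grep raid_bdev1 | awk '{print $6}'`, yielding `fail_per_s=0.49`) and asserting it is nonzero, since write errors were injected into `EE_BaseBdev1_malloc` and concat has no redundancy. The extraction pipeline can be sketched standalone; the sample log line below is invented purely so that its sixth field matches the parsing, and is not actual bdevperf output format:

```shell
#!/bin/sh
# Hypothetical bdevperf output; only the pipeline mirrors bdev_raid.sh@859,
# the line layout is an assumption made so field $6 carries fail/s.
bdevperf_log='Job: raid_bdev1 bdevperf run
raid_bdev1 randrw 60.01 1234 20.50 0.49'

fail_per_s=$(printf '%s\n' "$bdevperf_log" | grep -v Job | grep raid_bdev1 | awk '{print $6}')

# The real test passes when the rate differs from 0.00 ([[ $fail_per_s != \0\.\0\0 ]])
if [ "$fail_per_s" != "0.00" ]; then
    echo "write errors observed: $fail_per_s fail/s"
fi
```

With the log shown above, the test's nonzero check succeeds, matching the `0.49 != 0.00` comparison recorded at 13:24:38.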
00:15:57.758 13:24:38 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:15:57.758 13:24:38 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:15:57.758 13:24:38 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:15:57.758 13:24:38 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:15:57.758 13:24:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:57.758 ************************************ 00:15:57.758 START TEST raid_state_function_test 00:15:57.758 ************************************ 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 3 false 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 
-- # (( i <= num_base_bdevs )) 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=925424 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 925424' 00:15:57.758 Process raid pid: 925424 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 925424 /var/tmp/spdk-raid.sock 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 925424 ']' 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:57.758 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:15:57.758 13:24:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.758 [2024-07-25 13:24:38.411271] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:15:57.758 [2024-07-25 13:24:38.411315] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:57.758 [2024-07-25 13:24:38.497554] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:58.018 [2024-07-25 13:24:38.560153] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:15:58.018 [2024-07-25 13:24:38.599486] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:58.018 [2024-07-25 13:24:38.599508] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:58.586 13:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:15:58.586 13:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:15:58.586 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:58.846 [2024-07-25 13:24:39.438705] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:58.846 [2024-07-25 13:24:39.438734] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:58.846 [2024-07-25 13:24:39.438740] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:58.846 [2024-07-25 13:24:39.438746] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:58.846 [2024-07-25 13:24:39.438753] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:58.846 [2024-07-25 13:24:39.438759] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:58.846 13:24:39 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:58.846 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:59.106 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:59.106 "name": "Existed_Raid", 00:15:59.106 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.106 "strip_size_kb": 0, 00:15:59.106 "state": "configuring", 00:15:59.106 "raid_level": "raid1", 00:15:59.106 "superblock": false, 00:15:59.106 "num_base_bdevs": 3, 00:15:59.106 "num_base_bdevs_discovered": 0, 00:15:59.106 "num_base_bdevs_operational": 3, 00:15:59.106 "base_bdevs_list": [ 00:15:59.106 { 00:15:59.106 
"name": "BaseBdev1", 00:15:59.106 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.106 "is_configured": false, 00:15:59.106 "data_offset": 0, 00:15:59.106 "data_size": 0 00:15:59.106 }, 00:15:59.106 { 00:15:59.106 "name": "BaseBdev2", 00:15:59.106 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.106 "is_configured": false, 00:15:59.106 "data_offset": 0, 00:15:59.106 "data_size": 0 00:15:59.106 }, 00:15:59.106 { 00:15:59.106 "name": "BaseBdev3", 00:15:59.106 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:59.106 "is_configured": false, 00:15:59.106 "data_offset": 0, 00:15:59.106 "data_size": 0 00:15:59.106 } 00:15:59.106 ] 00:15:59.106 }' 00:15:59.106 13:24:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:59.106 13:24:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:59.675 13:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:59.675 [2024-07-25 13:24:40.364976] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:59.675 [2024-07-25 13:24:40.365001] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fb26d0 name Existed_Raid, state configuring 00:15:59.676 13:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:59.934 [2024-07-25 13:24:40.561473] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:59.934 [2024-07-25 13:24:40.561496] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:59.934 [2024-07-25 13:24:40.561501] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:15:59.934 [2024-07-25 13:24:40.561507] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:59.935 [2024-07-25 13:24:40.561511] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:59.935 [2024-07-25 13:24:40.561516] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:59.935 13:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:00.194 [2024-07-25 13:24:40.756360] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:00.194 BaseBdev1 00:16:00.194 13:24:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:00.194 13:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:00.194 13:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:00.194 13:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:00.194 13:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:00.194 13:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:00.194 13:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:00.194 13:24:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:00.454 [ 00:16:00.454 { 00:16:00.454 "name": "BaseBdev1", 00:16:00.454 "aliases": [ 00:16:00.454 "f82c6584-7d1d-4011-bf94-c227911f0a52" 
00:16:00.454 ], 00:16:00.454 "product_name": "Malloc disk", 00:16:00.454 "block_size": 512, 00:16:00.454 "num_blocks": 65536, 00:16:00.454 "uuid": "f82c6584-7d1d-4011-bf94-c227911f0a52", 00:16:00.454 "assigned_rate_limits": { 00:16:00.454 "rw_ios_per_sec": 0, 00:16:00.454 "rw_mbytes_per_sec": 0, 00:16:00.454 "r_mbytes_per_sec": 0, 00:16:00.454 "w_mbytes_per_sec": 0 00:16:00.454 }, 00:16:00.454 "claimed": true, 00:16:00.454 "claim_type": "exclusive_write", 00:16:00.454 "zoned": false, 00:16:00.454 "supported_io_types": { 00:16:00.454 "read": true, 00:16:00.454 "write": true, 00:16:00.454 "unmap": true, 00:16:00.454 "flush": true, 00:16:00.454 "reset": true, 00:16:00.454 "nvme_admin": false, 00:16:00.454 "nvme_io": false, 00:16:00.454 "nvme_io_md": false, 00:16:00.454 "write_zeroes": true, 00:16:00.454 "zcopy": true, 00:16:00.454 "get_zone_info": false, 00:16:00.454 "zone_management": false, 00:16:00.454 "zone_append": false, 00:16:00.454 "compare": false, 00:16:00.454 "compare_and_write": false, 00:16:00.454 "abort": true, 00:16:00.454 "seek_hole": false, 00:16:00.454 "seek_data": false, 00:16:00.454 "copy": true, 00:16:00.454 "nvme_iov_md": false 00:16:00.454 }, 00:16:00.454 "memory_domains": [ 00:16:00.454 { 00:16:00.454 "dma_device_id": "system", 00:16:00.454 "dma_device_type": 1 00:16:00.454 }, 00:16:00.454 { 00:16:00.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:00.454 "dma_device_type": 2 00:16:00.454 } 00:16:00.454 ], 00:16:00.454 "driver_specific": {} 00:16:00.454 } 00:16:00.454 ] 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:00.454 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:00.715 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.715 "name": "Existed_Raid", 00:16:00.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.715 "strip_size_kb": 0, 00:16:00.715 "state": "configuring", 00:16:00.715 "raid_level": "raid1", 00:16:00.715 "superblock": false, 00:16:00.715 "num_base_bdevs": 3, 00:16:00.715 "num_base_bdevs_discovered": 1, 00:16:00.715 "num_base_bdevs_operational": 3, 00:16:00.715 "base_bdevs_list": [ 00:16:00.715 { 00:16:00.715 "name": "BaseBdev1", 00:16:00.715 "uuid": "f82c6584-7d1d-4011-bf94-c227911f0a52", 00:16:00.715 "is_configured": true, 00:16:00.715 "data_offset": 0, 00:16:00.715 "data_size": 65536 00:16:00.715 }, 00:16:00.715 { 00:16:00.715 "name": "BaseBdev2", 00:16:00.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.715 "is_configured": 
false, 00:16:00.715 "data_offset": 0, 00:16:00.715 "data_size": 0 00:16:00.715 }, 00:16:00.715 { 00:16:00.715 "name": "BaseBdev3", 00:16:00.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.715 "is_configured": false, 00:16:00.715 "data_offset": 0, 00:16:00.715 "data_size": 0 00:16:00.715 } 00:16:00.715 ] 00:16:00.715 }' 00:16:00.715 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.715 13:24:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:01.285 13:24:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:01.285 [2024-07-25 13:24:42.071700] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:01.285 [2024-07-25 13:24:42.071738] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fb1fa0 name Existed_Raid, state configuring 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:01.545 [2024-07-25 13:24:42.268213] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:01.545 [2024-07-25 13:24:42.269330] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:01.545 [2024-07-25 13:24:42.269356] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:01.545 [2024-07-25 13:24:42.269362] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:01.545 [2024-07-25 13:24:42.269368] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 
-- # (( i = 1 )) 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.545 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:01.805 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:01.805 "name": "Existed_Raid", 00:16:01.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.805 "strip_size_kb": 0, 00:16:01.805 "state": "configuring", 00:16:01.805 "raid_level": "raid1", 00:16:01.805 "superblock": false, 00:16:01.805 "num_base_bdevs": 3, 
00:16:01.805 "num_base_bdevs_discovered": 1, 00:16:01.805 "num_base_bdevs_operational": 3, 00:16:01.805 "base_bdevs_list": [ 00:16:01.805 { 00:16:01.805 "name": "BaseBdev1", 00:16:01.805 "uuid": "f82c6584-7d1d-4011-bf94-c227911f0a52", 00:16:01.805 "is_configured": true, 00:16:01.805 "data_offset": 0, 00:16:01.805 "data_size": 65536 00:16:01.805 }, 00:16:01.805 { 00:16:01.805 "name": "BaseBdev2", 00:16:01.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.805 "is_configured": false, 00:16:01.805 "data_offset": 0, 00:16:01.805 "data_size": 0 00:16:01.805 }, 00:16:01.805 { 00:16:01.805 "name": "BaseBdev3", 00:16:01.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:01.805 "is_configured": false, 00:16:01.805 "data_offset": 0, 00:16:01.805 "data_size": 0 00:16:01.805 } 00:16:01.805 ] 00:16:01.805 }' 00:16:01.805 13:24:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:01.805 13:24:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.374 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:02.634 [2024-07-25 13:24:43.223641] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:02.634 BaseBdev2 00:16:02.634 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:02.634 13:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:02.634 13:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:02.634 13:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:02.634 13:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:02.634 13:24:43 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:02.634 13:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:02.899 [ 00:16:02.899 { 00:16:02.899 "name": "BaseBdev2", 00:16:02.899 "aliases": [ 00:16:02.899 "ce52f121-2ea5-4469-86d0-443742b50d51" 00:16:02.899 ], 00:16:02.899 "product_name": "Malloc disk", 00:16:02.899 "block_size": 512, 00:16:02.899 "num_blocks": 65536, 00:16:02.899 "uuid": "ce52f121-2ea5-4469-86d0-443742b50d51", 00:16:02.899 "assigned_rate_limits": { 00:16:02.899 "rw_ios_per_sec": 0, 00:16:02.899 "rw_mbytes_per_sec": 0, 00:16:02.899 "r_mbytes_per_sec": 0, 00:16:02.899 "w_mbytes_per_sec": 0 00:16:02.899 }, 00:16:02.899 "claimed": true, 00:16:02.899 "claim_type": "exclusive_write", 00:16:02.899 "zoned": false, 00:16:02.899 "supported_io_types": { 00:16:02.899 "read": true, 00:16:02.899 "write": true, 00:16:02.899 "unmap": true, 00:16:02.899 "flush": true, 00:16:02.899 "reset": true, 00:16:02.899 "nvme_admin": false, 00:16:02.899 "nvme_io": false, 00:16:02.899 "nvme_io_md": false, 00:16:02.899 "write_zeroes": true, 00:16:02.899 "zcopy": true, 00:16:02.899 "get_zone_info": false, 00:16:02.899 "zone_management": false, 00:16:02.899 "zone_append": false, 00:16:02.899 "compare": false, 00:16:02.899 "compare_and_write": false, 00:16:02.899 "abort": true, 00:16:02.899 "seek_hole": false, 00:16:02.899 "seek_data": false, 00:16:02.899 "copy": true, 00:16:02.899 "nvme_iov_md": false 00:16:02.899 }, 00:16:02.899 "memory_domains": [ 00:16:02.899 { 00:16:02.899 "dma_device_id": "system", 00:16:02.899 "dma_device_type": 1 00:16:02.899 }, 00:16:02.899 { 
00:16:02.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:02.899 "dma_device_type": 2 00:16:02.899 } 00:16:02.899 ], 00:16:02.899 "driver_specific": {} 00:16:02.899 } 00:16:02.899 ] 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.899 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.900 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.900 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
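The trace above shows the test repeatedly fetching raid bdev info with `rpc.py ... bdev_raid_get_bdevs all` and narrowing it with `jq -r '.[] | select(.name == "Existed_Raid")'`. As an illustration only (not part of the test suite), the same selection can be sketched in Python against a record shaped like the JSON dumped in this log; the field subset below is abbreviated from the log, and `select_raid_bdev` is a hypothetical helper name:

```python
import json

def select_raid_bdev(bdevs_json: str, name: str):
    # Mirror of the jq filter: .[] | select(.name == NAME)
    # Returns the first matching raid bdev record, or None.
    return next((b for b in json.loads(bdevs_json) if b["name"] == name), None)

# Abbreviated sample record, copied from the bdev_raid_get_bdevs output above.
sample = json.dumps([{
    "name": "Existed_Raid",
    "state": "configuring",
    "raid_level": "raid1",
    "num_base_bdevs": 3,
    "num_base_bdevs_discovered": 2,
    "num_base_bdevs_operational": 3,
}])

info = select_raid_bdev(sample, "Existed_Raid")
assert info is not None
assert info["state"] == "configuring"
```

This mirrors what `verify_raid_bdev_state` in `bdev_raid.sh` does in shell: fetch all raid bdevs, pick the one under test, then compare its fields against the expected values.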
00:16:03.158 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.158 "name": "Existed_Raid", 00:16:03.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.158 "strip_size_kb": 0, 00:16:03.158 "state": "configuring", 00:16:03.158 "raid_level": "raid1", 00:16:03.158 "superblock": false, 00:16:03.158 "num_base_bdevs": 3, 00:16:03.158 "num_base_bdevs_discovered": 2, 00:16:03.158 "num_base_bdevs_operational": 3, 00:16:03.158 "base_bdevs_list": [ 00:16:03.158 { 00:16:03.158 "name": "BaseBdev1", 00:16:03.158 "uuid": "f82c6584-7d1d-4011-bf94-c227911f0a52", 00:16:03.158 "is_configured": true, 00:16:03.158 "data_offset": 0, 00:16:03.158 "data_size": 65536 00:16:03.158 }, 00:16:03.158 { 00:16:03.158 "name": "BaseBdev2", 00:16:03.158 "uuid": "ce52f121-2ea5-4469-86d0-443742b50d51", 00:16:03.158 "is_configured": true, 00:16:03.158 "data_offset": 0, 00:16:03.158 "data_size": 65536 00:16:03.158 }, 00:16:03.158 { 00:16:03.158 "name": "BaseBdev3", 00:16:03.158 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.158 "is_configured": false, 00:16:03.158 "data_offset": 0, 00:16:03.158 "data_size": 0 00:16:03.158 } 00:16:03.158 ] 00:16:03.158 }' 00:16:03.158 13:24:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.158 13:24:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:03.726 13:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:03.726 [2024-07-25 13:24:44.499889] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:03.726 [2024-07-25 13:24:44.499913] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fb2ea0 00:16:03.726 [2024-07-25 13:24:44.499918] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 
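At this point the log shows the pattern the test is exercising: with one or two base bdevs claimed the raid bdev reports `"state": "configuring"`, and only after the third malloc bdev (BaseBdev3) is created does it go `"online"` with all three discovered. The transition rule visible in the log can be modeled with a small sketch; this is a simplified model for illustration, not SPDK's actual state machine:

```python
def expected_raid_state(num_discovered: int, num_base_bdevs: int) -> str:
    # Observed behavior in this log: the raid bdev stays "configuring"
    # until every configured base bdev has been discovered and claimed,
    # then transitions to "online".
    return "online" if num_discovered == num_base_bdevs else "configuring"

# Matches the three snapshots in the trace: 1/3, 2/3, then 3/3 base bdevs.
assert expected_raid_state(1, 3) == "configuring"
assert expected_raid_state(2, 3) == "configuring"
assert expected_raid_state(3, 3) == "online"
```

The later part of the test (removing BaseBdev1) relies on raid1's redundancy: with 2 of 3 base bdevs remaining, the expected state stays `online` rather than dropping to a failed state.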
00:16:03.726 [2024-07-25 13:24:44.500091] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fb2b70 00:16:03.726 [2024-07-25 13:24:44.500187] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fb2ea0 00:16:03.726 [2024-07-25 13:24:44.500193] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fb2ea0 00:16:03.726 [2024-07-25 13:24:44.500309] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:03.726 BaseBdev3 00:16:03.985 13:24:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:03.985 13:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:03.985 13:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:03.985 13:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:03.985 13:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:03.985 13:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:03.985 13:24:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:04.552 13:24:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:04.552 [ 00:16:04.552 { 00:16:04.552 "name": "BaseBdev3", 00:16:04.552 "aliases": [ 00:16:04.552 "c32274cb-51e5-4bd5-ac51-a94eda5d5eea" 00:16:04.552 ], 00:16:04.552 "product_name": "Malloc disk", 00:16:04.552 "block_size": 512, 00:16:04.552 "num_blocks": 65536, 00:16:04.552 "uuid": "c32274cb-51e5-4bd5-ac51-a94eda5d5eea", 00:16:04.552 "assigned_rate_limits": { 
00:16:04.552 "rw_ios_per_sec": 0, 00:16:04.552 "rw_mbytes_per_sec": 0, 00:16:04.552 "r_mbytes_per_sec": 0, 00:16:04.552 "w_mbytes_per_sec": 0 00:16:04.552 }, 00:16:04.552 "claimed": true, 00:16:04.552 "claim_type": "exclusive_write", 00:16:04.552 "zoned": false, 00:16:04.552 "supported_io_types": { 00:16:04.552 "read": true, 00:16:04.552 "write": true, 00:16:04.552 "unmap": true, 00:16:04.552 "flush": true, 00:16:04.552 "reset": true, 00:16:04.552 "nvme_admin": false, 00:16:04.552 "nvme_io": false, 00:16:04.552 "nvme_io_md": false, 00:16:04.552 "write_zeroes": true, 00:16:04.552 "zcopy": true, 00:16:04.552 "get_zone_info": false, 00:16:04.552 "zone_management": false, 00:16:04.553 "zone_append": false, 00:16:04.553 "compare": false, 00:16:04.553 "compare_and_write": false, 00:16:04.553 "abort": true, 00:16:04.553 "seek_hole": false, 00:16:04.553 "seek_data": false, 00:16:04.553 "copy": true, 00:16:04.553 "nvme_iov_md": false 00:16:04.553 }, 00:16:04.553 "memory_domains": [ 00:16:04.553 { 00:16:04.553 "dma_device_id": "system", 00:16:04.553 "dma_device_type": 1 00:16:04.553 }, 00:16:04.553 { 00:16:04.553 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:04.553 "dma_device_type": 2 00:16:04.553 } 00:16:04.553 ], 00:16:04.553 "driver_specific": {} 00:16:04.553 } 00:16:04.553 ] 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:04.553 
13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.553 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:04.812 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:04.812 "name": "Existed_Raid", 00:16:04.812 "uuid": "556765aa-dea4-4c3e-9ce6-9612b8bd1a04", 00:16:04.812 "strip_size_kb": 0, 00:16:04.812 "state": "online", 00:16:04.812 "raid_level": "raid1", 00:16:04.812 "superblock": false, 00:16:04.812 "num_base_bdevs": 3, 00:16:04.812 "num_base_bdevs_discovered": 3, 00:16:04.812 "num_base_bdevs_operational": 3, 00:16:04.812 "base_bdevs_list": [ 00:16:04.812 { 00:16:04.812 "name": "BaseBdev1", 00:16:04.812 "uuid": "f82c6584-7d1d-4011-bf94-c227911f0a52", 00:16:04.812 "is_configured": true, 00:16:04.812 "data_offset": 0, 00:16:04.812 "data_size": 65536 00:16:04.812 }, 00:16:04.812 { 00:16:04.812 "name": "BaseBdev2", 00:16:04.812 "uuid": "ce52f121-2ea5-4469-86d0-443742b50d51", 00:16:04.812 "is_configured": true, 00:16:04.812 "data_offset": 0, 
00:16:04.812 "data_size": 65536 00:16:04.812 }, 00:16:04.812 { 00:16:04.812 "name": "BaseBdev3", 00:16:04.812 "uuid": "c32274cb-51e5-4bd5-ac51-a94eda5d5eea", 00:16:04.812 "is_configured": true, 00:16:04.812 "data_offset": 0, 00:16:04.812 "data_size": 65536 00:16:04.812 } 00:16:04.812 ] 00:16:04.812 }' 00:16:04.812 13:24:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:04.812 13:24:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:05.380 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:05.380 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:05.380 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:05.380 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:05.380 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:05.380 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:05.380 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:05.380 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:05.639 [2024-07-25 13:24:46.196431] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:05.639 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:05.639 "name": "Existed_Raid", 00:16:05.639 "aliases": [ 00:16:05.639 "556765aa-dea4-4c3e-9ce6-9612b8bd1a04" 00:16:05.639 ], 00:16:05.639 "product_name": "Raid Volume", 00:16:05.639 "block_size": 512, 00:16:05.639 "num_blocks": 65536, 00:16:05.639 "uuid": 
"556765aa-dea4-4c3e-9ce6-9612b8bd1a04", 00:16:05.639 "assigned_rate_limits": { 00:16:05.639 "rw_ios_per_sec": 0, 00:16:05.639 "rw_mbytes_per_sec": 0, 00:16:05.639 "r_mbytes_per_sec": 0, 00:16:05.639 "w_mbytes_per_sec": 0 00:16:05.639 }, 00:16:05.639 "claimed": false, 00:16:05.639 "zoned": false, 00:16:05.639 "supported_io_types": { 00:16:05.639 "read": true, 00:16:05.639 "write": true, 00:16:05.639 "unmap": false, 00:16:05.639 "flush": false, 00:16:05.639 "reset": true, 00:16:05.639 "nvme_admin": false, 00:16:05.639 "nvme_io": false, 00:16:05.639 "nvme_io_md": false, 00:16:05.639 "write_zeroes": true, 00:16:05.639 "zcopy": false, 00:16:05.639 "get_zone_info": false, 00:16:05.639 "zone_management": false, 00:16:05.639 "zone_append": false, 00:16:05.639 "compare": false, 00:16:05.639 "compare_and_write": false, 00:16:05.639 "abort": false, 00:16:05.639 "seek_hole": false, 00:16:05.639 "seek_data": false, 00:16:05.639 "copy": false, 00:16:05.639 "nvme_iov_md": false 00:16:05.639 }, 00:16:05.639 "memory_domains": [ 00:16:05.639 { 00:16:05.639 "dma_device_id": "system", 00:16:05.639 "dma_device_type": 1 00:16:05.639 }, 00:16:05.639 { 00:16:05.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.639 "dma_device_type": 2 00:16:05.639 }, 00:16:05.639 { 00:16:05.639 "dma_device_id": "system", 00:16:05.639 "dma_device_type": 1 00:16:05.639 }, 00:16:05.639 { 00:16:05.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.639 "dma_device_type": 2 00:16:05.639 }, 00:16:05.639 { 00:16:05.639 "dma_device_id": "system", 00:16:05.639 "dma_device_type": 1 00:16:05.639 }, 00:16:05.639 { 00:16:05.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.639 "dma_device_type": 2 00:16:05.639 } 00:16:05.639 ], 00:16:05.639 "driver_specific": { 00:16:05.639 "raid": { 00:16:05.639 "uuid": "556765aa-dea4-4c3e-9ce6-9612b8bd1a04", 00:16:05.639 "strip_size_kb": 0, 00:16:05.639 "state": "online", 00:16:05.639 "raid_level": "raid1", 00:16:05.639 "superblock": false, 00:16:05.639 
"num_base_bdevs": 3, 00:16:05.639 "num_base_bdevs_discovered": 3, 00:16:05.639 "num_base_bdevs_operational": 3, 00:16:05.639 "base_bdevs_list": [ 00:16:05.639 { 00:16:05.639 "name": "BaseBdev1", 00:16:05.639 "uuid": "f82c6584-7d1d-4011-bf94-c227911f0a52", 00:16:05.639 "is_configured": true, 00:16:05.639 "data_offset": 0, 00:16:05.639 "data_size": 65536 00:16:05.639 }, 00:16:05.639 { 00:16:05.640 "name": "BaseBdev2", 00:16:05.640 "uuid": "ce52f121-2ea5-4469-86d0-443742b50d51", 00:16:05.640 "is_configured": true, 00:16:05.640 "data_offset": 0, 00:16:05.640 "data_size": 65536 00:16:05.640 }, 00:16:05.640 { 00:16:05.640 "name": "BaseBdev3", 00:16:05.640 "uuid": "c32274cb-51e5-4bd5-ac51-a94eda5d5eea", 00:16:05.640 "is_configured": true, 00:16:05.640 "data_offset": 0, 00:16:05.640 "data_size": 65536 00:16:05.640 } 00:16:05.640 ] 00:16:05.640 } 00:16:05.640 } 00:16:05.640 }' 00:16:05.640 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:05.640 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:05.640 BaseBdev2 00:16:05.640 BaseBdev3' 00:16:05.640 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:05.640 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:05.640 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:05.899 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:05.899 "name": "BaseBdev1", 00:16:05.899 "aliases": [ 00:16:05.899 "f82c6584-7d1d-4011-bf94-c227911f0a52" 00:16:05.899 ], 00:16:05.899 "product_name": "Malloc disk", 00:16:05.899 "block_size": 512, 00:16:05.899 "num_blocks": 65536, 00:16:05.899 "uuid": 
"f82c6584-7d1d-4011-bf94-c227911f0a52", 00:16:05.899 "assigned_rate_limits": { 00:16:05.899 "rw_ios_per_sec": 0, 00:16:05.899 "rw_mbytes_per_sec": 0, 00:16:05.899 "r_mbytes_per_sec": 0, 00:16:05.899 "w_mbytes_per_sec": 0 00:16:05.899 }, 00:16:05.899 "claimed": true, 00:16:05.899 "claim_type": "exclusive_write", 00:16:05.899 "zoned": false, 00:16:05.899 "supported_io_types": { 00:16:05.899 "read": true, 00:16:05.899 "write": true, 00:16:05.899 "unmap": true, 00:16:05.899 "flush": true, 00:16:05.899 "reset": true, 00:16:05.899 "nvme_admin": false, 00:16:05.899 "nvme_io": false, 00:16:05.899 "nvme_io_md": false, 00:16:05.899 "write_zeroes": true, 00:16:05.899 "zcopy": true, 00:16:05.899 "get_zone_info": false, 00:16:05.899 "zone_management": false, 00:16:05.899 "zone_append": false, 00:16:05.899 "compare": false, 00:16:05.899 "compare_and_write": false, 00:16:05.899 "abort": true, 00:16:05.899 "seek_hole": false, 00:16:05.899 "seek_data": false, 00:16:05.899 "copy": true, 00:16:05.899 "nvme_iov_md": false 00:16:05.899 }, 00:16:05.899 "memory_domains": [ 00:16:05.899 { 00:16:05.899 "dma_device_id": "system", 00:16:05.899 "dma_device_type": 1 00:16:05.899 }, 00:16:05.899 { 00:16:05.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:05.899 "dma_device_type": 2 00:16:05.899 } 00:16:05.899 ], 00:16:05.899 "driver_specific": {} 00:16:05.899 }' 00:16:05.899 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.899 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:05.899 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:05.899 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.899 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:05.899 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:05.899 13:24:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.158 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.158 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:06.158 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.158 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:06.418 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:06.418 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:06.418 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:06.418 13:24:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:06.987 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:06.987 "name": "BaseBdev2", 00:16:06.987 "aliases": [ 00:16:06.987 "ce52f121-2ea5-4469-86d0-443742b50d51" 00:16:06.987 ], 00:16:06.987 "product_name": "Malloc disk", 00:16:06.987 "block_size": 512, 00:16:06.987 "num_blocks": 65536, 00:16:06.987 "uuid": "ce52f121-2ea5-4469-86d0-443742b50d51", 00:16:06.987 "assigned_rate_limits": { 00:16:06.987 "rw_ios_per_sec": 0, 00:16:06.987 "rw_mbytes_per_sec": 0, 00:16:06.987 "r_mbytes_per_sec": 0, 00:16:06.987 "w_mbytes_per_sec": 0 00:16:06.987 }, 00:16:06.987 "claimed": true, 00:16:06.987 "claim_type": "exclusive_write", 00:16:06.987 "zoned": false, 00:16:06.987 "supported_io_types": { 00:16:06.987 "read": true, 00:16:06.987 "write": true, 00:16:06.987 "unmap": true, 00:16:06.987 "flush": true, 00:16:06.987 "reset": true, 00:16:06.987 "nvme_admin": false, 00:16:06.987 "nvme_io": false, 00:16:06.987 "nvme_io_md": false, 
00:16:06.987 "write_zeroes": true, 00:16:06.987 "zcopy": true, 00:16:06.987 "get_zone_info": false, 00:16:06.987 "zone_management": false, 00:16:06.987 "zone_append": false, 00:16:06.987 "compare": false, 00:16:06.987 "compare_and_write": false, 00:16:06.987 "abort": true, 00:16:06.987 "seek_hole": false, 00:16:06.987 "seek_data": false, 00:16:06.987 "copy": true, 00:16:06.987 "nvme_iov_md": false 00:16:06.987 }, 00:16:06.987 "memory_domains": [ 00:16:06.987 { 00:16:06.987 "dma_device_id": "system", 00:16:06.987 "dma_device_type": 1 00:16:06.987 }, 00:16:06.987 { 00:16:06.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:06.987 "dma_device_type": 2 00:16:06.987 } 00:16:06.987 ], 00:16:06.987 "driver_specific": {} 00:16:06.987 }' 00:16:06.987 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.987 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:06.987 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:06.987 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.987 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:06.987 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:06.987 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:06.987 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.247 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.247 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.247 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.247 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.247 13:24:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:07.247 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:07.247 13:24:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:07.505 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:07.505 "name": "BaseBdev3", 00:16:07.505 "aliases": [ 00:16:07.505 "c32274cb-51e5-4bd5-ac51-a94eda5d5eea" 00:16:07.505 ], 00:16:07.505 "product_name": "Malloc disk", 00:16:07.505 "block_size": 512, 00:16:07.505 "num_blocks": 65536, 00:16:07.505 "uuid": "c32274cb-51e5-4bd5-ac51-a94eda5d5eea", 00:16:07.505 "assigned_rate_limits": { 00:16:07.505 "rw_ios_per_sec": 0, 00:16:07.505 "rw_mbytes_per_sec": 0, 00:16:07.505 "r_mbytes_per_sec": 0, 00:16:07.505 "w_mbytes_per_sec": 0 00:16:07.505 }, 00:16:07.505 "claimed": true, 00:16:07.505 "claim_type": "exclusive_write", 00:16:07.505 "zoned": false, 00:16:07.505 "supported_io_types": { 00:16:07.505 "read": true, 00:16:07.505 "write": true, 00:16:07.505 "unmap": true, 00:16:07.505 "flush": true, 00:16:07.505 "reset": true, 00:16:07.505 "nvme_admin": false, 00:16:07.505 "nvme_io": false, 00:16:07.505 "nvme_io_md": false, 00:16:07.505 "write_zeroes": true, 00:16:07.505 "zcopy": true, 00:16:07.505 "get_zone_info": false, 00:16:07.505 "zone_management": false, 00:16:07.505 "zone_append": false, 00:16:07.505 "compare": false, 00:16:07.505 "compare_and_write": false, 00:16:07.505 "abort": true, 00:16:07.505 "seek_hole": false, 00:16:07.505 "seek_data": false, 00:16:07.505 "copy": true, 00:16:07.505 "nvme_iov_md": false 00:16:07.505 }, 00:16:07.505 "memory_domains": [ 00:16:07.505 { 00:16:07.505 "dma_device_id": "system", 00:16:07.505 "dma_device_type": 1 00:16:07.505 }, 00:16:07.505 { 00:16:07.505 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:07.505 "dma_device_type": 2 00:16:07.505 } 00:16:07.505 ], 00:16:07.505 "driver_specific": {} 00:16:07.505 }' 00:16:07.505 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.505 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:07.505 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:07.506 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.506 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:07.765 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:07.765 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.765 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:07.765 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:07.765 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.765 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:07.765 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:07.765 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:08.023 [2024-07-25 13:24:48.722631] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:08.024 13:24:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:08.592 13:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:08.592 "name": "Existed_Raid", 00:16:08.592 "uuid": "556765aa-dea4-4c3e-9ce6-9612b8bd1a04", 00:16:08.592 "strip_size_kb": 0, 00:16:08.592 "state": "online", 00:16:08.592 "raid_level": "raid1", 
00:16:08.592 "superblock": false, 00:16:08.592 "num_base_bdevs": 3, 00:16:08.592 "num_base_bdevs_discovered": 2, 00:16:08.592 "num_base_bdevs_operational": 2, 00:16:08.592 "base_bdevs_list": [ 00:16:08.592 { 00:16:08.592 "name": null, 00:16:08.592 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:08.592 "is_configured": false, 00:16:08.592 "data_offset": 0, 00:16:08.592 "data_size": 65536 00:16:08.592 }, 00:16:08.592 { 00:16:08.592 "name": "BaseBdev2", 00:16:08.592 "uuid": "ce52f121-2ea5-4469-86d0-443742b50d51", 00:16:08.592 "is_configured": true, 00:16:08.592 "data_offset": 0, 00:16:08.592 "data_size": 65536 00:16:08.592 }, 00:16:08.592 { 00:16:08.592 "name": "BaseBdev3", 00:16:08.592 "uuid": "c32274cb-51e5-4bd5-ac51-a94eda5d5eea", 00:16:08.592 "is_configured": true, 00:16:08.592 "data_offset": 0, 00:16:08.592 "data_size": 65536 00:16:08.592 } 00:16:08.592 ] 00:16:08.592 }' 00:16:08.592 13:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:08.592 13:24:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:09.160 13:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:09.161 13:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:09.161 13:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.161 13:24:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:09.420 13:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:09.420 13:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:09.420 13:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:09.680 [2024-07-25 13:24:50.302658] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:09.680 13:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:09.680 13:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:09.680 13:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:09.680 13:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:09.980 13:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:09.980 13:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:09.980 13:24:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:10.266 [2024-07-25 13:24:51.022258] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:10.266 [2024-07-25 13:24:51.022317] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:10.266 [2024-07-25 13:24:51.028235] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:10.266 [2024-07-25 13:24:51.028259] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:10.266 [2024-07-25 13:24:51.028265] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fb2ea0 name Existed_Raid, state offline 00:16:10.524 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:10.524 13:24:51 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:10.524 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:10.524 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:10.525 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:10.525 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:10.525 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:10.525 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:10.525 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:10.525 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:10.783 BaseBdev2 00:16:10.783 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:10.783 13:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:10.783 13:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:10.783 13:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:10.783 13:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:10.783 13:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:10.783 13:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:11.042 13:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:11.301 [ 00:16:11.301 { 00:16:11.301 "name": "BaseBdev2", 00:16:11.301 "aliases": [ 00:16:11.301 "26d669d7-c904-4b19-9a7b-cfc2d6d57992" 00:16:11.301 ], 00:16:11.301 "product_name": "Malloc disk", 00:16:11.301 "block_size": 512, 00:16:11.301 "num_blocks": 65536, 00:16:11.301 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:11.301 "assigned_rate_limits": { 00:16:11.301 "rw_ios_per_sec": 0, 00:16:11.301 "rw_mbytes_per_sec": 0, 00:16:11.301 "r_mbytes_per_sec": 0, 00:16:11.301 "w_mbytes_per_sec": 0 00:16:11.301 }, 00:16:11.301 "claimed": false, 00:16:11.301 "zoned": false, 00:16:11.301 "supported_io_types": { 00:16:11.301 "read": true, 00:16:11.301 "write": true, 00:16:11.301 "unmap": true, 00:16:11.301 "flush": true, 00:16:11.301 "reset": true, 00:16:11.301 "nvme_admin": false, 00:16:11.301 "nvme_io": false, 00:16:11.301 "nvme_io_md": false, 00:16:11.301 "write_zeroes": true, 00:16:11.301 "zcopy": true, 00:16:11.301 "get_zone_info": false, 00:16:11.301 "zone_management": false, 00:16:11.301 "zone_append": false, 00:16:11.301 "compare": false, 00:16:11.301 "compare_and_write": false, 00:16:11.301 "abort": true, 00:16:11.301 "seek_hole": false, 00:16:11.301 "seek_data": false, 00:16:11.301 "copy": true, 00:16:11.301 "nvme_iov_md": false 00:16:11.301 }, 00:16:11.301 "memory_domains": [ 00:16:11.301 { 00:16:11.301 "dma_device_id": "system", 00:16:11.301 "dma_device_type": 1 00:16:11.301 }, 00:16:11.301 { 00:16:11.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.301 "dma_device_type": 2 00:16:11.301 } 00:16:11.301 ], 00:16:11.301 "driver_specific": {} 00:16:11.301 } 00:16:11.301 ] 00:16:11.301 13:24:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:11.301 
13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:11.301 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:11.301 13:24:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:11.301 BaseBdev3 00:16:11.301 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:11.301 13:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:11.301 13:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:11.301 13:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:11.301 13:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:11.301 13:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:11.301 13:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:11.560 13:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:11.819 [ 00:16:11.819 { 00:16:11.819 "name": "BaseBdev3", 00:16:11.819 "aliases": [ 00:16:11.819 "f56533a3-66b3-46e7-bff8-ad258d271761" 00:16:11.819 ], 00:16:11.819 "product_name": "Malloc disk", 00:16:11.819 "block_size": 512, 00:16:11.819 "num_blocks": 65536, 00:16:11.819 "uuid": "f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:11.819 "assigned_rate_limits": { 00:16:11.819 "rw_ios_per_sec": 0, 00:16:11.819 "rw_mbytes_per_sec": 0, 00:16:11.819 
"r_mbytes_per_sec": 0, 00:16:11.819 "w_mbytes_per_sec": 0 00:16:11.819 }, 00:16:11.819 "claimed": false, 00:16:11.819 "zoned": false, 00:16:11.819 "supported_io_types": { 00:16:11.819 "read": true, 00:16:11.819 "write": true, 00:16:11.819 "unmap": true, 00:16:11.819 "flush": true, 00:16:11.819 "reset": true, 00:16:11.819 "nvme_admin": false, 00:16:11.819 "nvme_io": false, 00:16:11.819 "nvme_io_md": false, 00:16:11.819 "write_zeroes": true, 00:16:11.819 "zcopy": true, 00:16:11.819 "get_zone_info": false, 00:16:11.819 "zone_management": false, 00:16:11.819 "zone_append": false, 00:16:11.819 "compare": false, 00:16:11.819 "compare_and_write": false, 00:16:11.819 "abort": true, 00:16:11.819 "seek_hole": false, 00:16:11.819 "seek_data": false, 00:16:11.819 "copy": true, 00:16:11.819 "nvme_iov_md": false 00:16:11.819 }, 00:16:11.819 "memory_domains": [ 00:16:11.819 { 00:16:11.819 "dma_device_id": "system", 00:16:11.819 "dma_device_type": 1 00:16:11.819 }, 00:16:11.819 { 00:16:11.819 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:11.819 "dma_device_type": 2 00:16:11.819 } 00:16:11.819 ], 00:16:11.819 "driver_specific": {} 00:16:11.819 } 00:16:11.819 ] 00:16:11.819 13:24:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:11.819 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:11.819 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:11.819 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:12.079 [2024-07-25 13:24:52.762511] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:12.079 [2024-07-25 13:24:52.762540] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 
00:16:12.079 [2024-07-25 13:24:52.762556] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:12.079 [2024-07-25 13:24:52.763592] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:12.079 13:24:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:12.647 13:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:12.647 "name": "Existed_Raid", 00:16:12.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.647 "strip_size_kb": 0, 00:16:12.647 "state": 
"configuring", 00:16:12.647 "raid_level": "raid1", 00:16:12.647 "superblock": false, 00:16:12.647 "num_base_bdevs": 3, 00:16:12.647 "num_base_bdevs_discovered": 2, 00:16:12.647 "num_base_bdevs_operational": 3, 00:16:12.647 "base_bdevs_list": [ 00:16:12.647 { 00:16:12.647 "name": "BaseBdev1", 00:16:12.647 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:12.647 "is_configured": false, 00:16:12.647 "data_offset": 0, 00:16:12.647 "data_size": 0 00:16:12.647 }, 00:16:12.647 { 00:16:12.647 "name": "BaseBdev2", 00:16:12.647 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:12.647 "is_configured": true, 00:16:12.647 "data_offset": 0, 00:16:12.647 "data_size": 65536 00:16:12.647 }, 00:16:12.647 { 00:16:12.647 "name": "BaseBdev3", 00:16:12.647 "uuid": "f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:12.647 "is_configured": true, 00:16:12.647 "data_offset": 0, 00:16:12.647 "data_size": 65536 00:16:12.647 } 00:16:12.647 ] 00:16:12.647 }' 00:16:12.647 13:24:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:12.647 13:24:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.024 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:14.024 [2024-07-25 13:24:54.607199] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:14.025 13:24:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:14.025 "name": "Existed_Raid", 00:16:14.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:14.025 "strip_size_kb": 0, 00:16:14.025 "state": "configuring", 00:16:14.025 "raid_level": "raid1", 00:16:14.025 "superblock": false, 00:16:14.025 "num_base_bdevs": 3, 00:16:14.025 "num_base_bdevs_discovered": 1, 00:16:14.025 "num_base_bdevs_operational": 3, 00:16:14.025 "base_bdevs_list": [ 00:16:14.025 { 00:16:14.025 "name": "BaseBdev1", 00:16:14.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:14.025 "is_configured": false, 00:16:14.025 "data_offset": 0, 00:16:14.025 "data_size": 0 00:16:14.025 }, 00:16:14.025 { 00:16:14.025 "name": null, 00:16:14.025 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:14.025 "is_configured": false, 00:16:14.025 "data_offset": 0, 00:16:14.025 "data_size": 65536 00:16:14.025 }, 00:16:14.025 { 00:16:14.025 "name": "BaseBdev3", 00:16:14.025 "uuid": 
"f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:14.025 "is_configured": true, 00:16:14.025 "data_offset": 0, 00:16:14.025 "data_size": 65536 00:16:14.025 } 00:16:14.025 ] 00:16:14.025 }' 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:14.025 13:24:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:15.404 13:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.404 13:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:15.404 13:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:15.404 13:24:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:15.404 [2024-07-25 13:24:56.111905] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:15.404 BaseBdev1 00:16:15.404 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:15.404 13:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:15.404 13:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:15.404 13:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:15.404 13:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:15.404 13:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:15.404 13:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:15.664 13:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:15.923 [ 00:16:15.923 { 00:16:15.923 "name": "BaseBdev1", 00:16:15.923 "aliases": [ 00:16:15.923 "5b0e8fd0-27fc-4eba-8840-dab32e34c684" 00:16:15.923 ], 00:16:15.923 "product_name": "Malloc disk", 00:16:15.923 "block_size": 512, 00:16:15.923 "num_blocks": 65536, 00:16:15.923 "uuid": "5b0e8fd0-27fc-4eba-8840-dab32e34c684", 00:16:15.923 "assigned_rate_limits": { 00:16:15.923 "rw_ios_per_sec": 0, 00:16:15.923 "rw_mbytes_per_sec": 0, 00:16:15.923 "r_mbytes_per_sec": 0, 00:16:15.923 "w_mbytes_per_sec": 0 00:16:15.923 }, 00:16:15.923 "claimed": true, 00:16:15.923 "claim_type": "exclusive_write", 00:16:15.923 "zoned": false, 00:16:15.923 "supported_io_types": { 00:16:15.923 "read": true, 00:16:15.923 "write": true, 00:16:15.923 "unmap": true, 00:16:15.923 "flush": true, 00:16:15.923 "reset": true, 00:16:15.923 "nvme_admin": false, 00:16:15.923 "nvme_io": false, 00:16:15.923 "nvme_io_md": false, 00:16:15.923 "write_zeroes": true, 00:16:15.923 "zcopy": true, 00:16:15.923 "get_zone_info": false, 00:16:15.923 "zone_management": false, 00:16:15.923 "zone_append": false, 00:16:15.923 "compare": false, 00:16:15.923 "compare_and_write": false, 00:16:15.923 "abort": true, 00:16:15.923 "seek_hole": false, 00:16:15.923 "seek_data": false, 00:16:15.923 "copy": true, 00:16:15.923 "nvme_iov_md": false 00:16:15.923 }, 00:16:15.923 "memory_domains": [ 00:16:15.923 { 00:16:15.923 "dma_device_id": "system", 00:16:15.923 "dma_device_type": 1 00:16:15.923 }, 00:16:15.923 { 00:16:15.923 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:15.923 "dma_device_type": 2 00:16:15.923 } 00:16:15.923 ], 00:16:15.923 "driver_specific": {} 00:16:15.923 } 00:16:15.923 ] 
00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:15.923 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:16.182 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:16.182 "name": "Existed_Raid", 00:16:16.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:16.182 "strip_size_kb": 0, 00:16:16.182 "state": "configuring", 00:16:16.182 "raid_level": "raid1", 00:16:16.182 "superblock": false, 00:16:16.182 "num_base_bdevs": 3, 00:16:16.182 
"num_base_bdevs_discovered": 2, 00:16:16.182 "num_base_bdevs_operational": 3, 00:16:16.182 "base_bdevs_list": [ 00:16:16.182 { 00:16:16.182 "name": "BaseBdev1", 00:16:16.182 "uuid": "5b0e8fd0-27fc-4eba-8840-dab32e34c684", 00:16:16.182 "is_configured": true, 00:16:16.182 "data_offset": 0, 00:16:16.182 "data_size": 65536 00:16:16.182 }, 00:16:16.182 { 00:16:16.182 "name": null, 00:16:16.182 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:16.182 "is_configured": false, 00:16:16.182 "data_offset": 0, 00:16:16.182 "data_size": 65536 00:16:16.182 }, 00:16:16.182 { 00:16:16.182 "name": "BaseBdev3", 00:16:16.182 "uuid": "f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:16.182 "is_configured": true, 00:16:16.182 "data_offset": 0, 00:16:16.182 "data_size": 65536 00:16:16.182 } 00:16:16.182 ] 00:16:16.182 }' 00:16:16.182 13:24:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:16.182 13:24:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:16.750 13:24:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:16.750 13:24:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:17.009 13:24:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:17.009 13:24:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:17.268 [2024-07-25 13:24:58.056854] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:17.527 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:18.093 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.093 "name": "Existed_Raid", 00:16:18.093 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:18.093 "strip_size_kb": 0, 00:16:18.093 "state": "configuring", 00:16:18.093 "raid_level": "raid1", 00:16:18.093 "superblock": false, 00:16:18.093 "num_base_bdevs": 3, 00:16:18.093 "num_base_bdevs_discovered": 1, 00:16:18.093 "num_base_bdevs_operational": 3, 00:16:18.093 "base_bdevs_list": [ 00:16:18.093 { 00:16:18.093 "name": "BaseBdev1", 00:16:18.093 "uuid": "5b0e8fd0-27fc-4eba-8840-dab32e34c684", 00:16:18.093 "is_configured": true, 00:16:18.093 "data_offset": 0, 00:16:18.093 "data_size": 65536 
00:16:18.093 }, 00:16:18.093 { 00:16:18.093 "name": null, 00:16:18.093 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:18.093 "is_configured": false, 00:16:18.093 "data_offset": 0, 00:16:18.093 "data_size": 65536 00:16:18.093 }, 00:16:18.093 { 00:16:18.093 "name": null, 00:16:18.093 "uuid": "f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:18.093 "is_configured": false, 00:16:18.093 "data_offset": 0, 00:16:18.093 "data_size": 65536 00:16:18.093 } 00:16:18.093 ] 00:16:18.093 }' 00:16:18.093 13:24:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.093 13:24:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:18.659 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.659 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:18.659 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:18.659 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:19.226 [2024-07-25 13:24:59.841401] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:19.226 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:19.226 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:19.226 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:19.226 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:19.227 13:24:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:19.227 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:19.227 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:19.227 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:19.227 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:19.227 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:19.227 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:19.227 13:24:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:19.793 13:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:19.793 "name": "Existed_Raid", 00:16:19.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:19.793 "strip_size_kb": 0, 00:16:19.793 "state": "configuring", 00:16:19.793 "raid_level": "raid1", 00:16:19.793 "superblock": false, 00:16:19.793 "num_base_bdevs": 3, 00:16:19.793 "num_base_bdevs_discovered": 2, 00:16:19.793 "num_base_bdevs_operational": 3, 00:16:19.793 "base_bdevs_list": [ 00:16:19.793 { 00:16:19.793 "name": "BaseBdev1", 00:16:19.793 "uuid": "5b0e8fd0-27fc-4eba-8840-dab32e34c684", 00:16:19.793 "is_configured": true, 00:16:19.793 "data_offset": 0, 00:16:19.793 "data_size": 65536 00:16:19.793 }, 00:16:19.793 { 00:16:19.793 "name": null, 00:16:19.793 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:19.793 "is_configured": false, 00:16:19.793 "data_offset": 0, 00:16:19.793 "data_size": 65536 00:16:19.793 }, 00:16:19.793 { 00:16:19.793 "name": "BaseBdev3", 00:16:19.793 "uuid": 
"f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:19.793 "is_configured": true, 00:16:19.793 "data_offset": 0, 00:16:19.793 "data_size": 65536 00:16:19.793 } 00:16:19.793 ] 00:16:19.793 }' 00:16:19.793 13:25:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:19.793 13:25:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:20.727 13:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.727 13:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:20.985 13:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:20.985 13:25:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:21.554 [2024-07-25 13:25:02.063011] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:21.554 13:25:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:21.554 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:22.123 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:22.123 "name": "Existed_Raid", 00:16:22.123 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:22.123 "strip_size_kb": 0, 00:16:22.123 "state": "configuring", 00:16:22.123 "raid_level": "raid1", 00:16:22.123 "superblock": false, 00:16:22.123 "num_base_bdevs": 3, 00:16:22.123 "num_base_bdevs_discovered": 1, 00:16:22.123 "num_base_bdevs_operational": 3, 00:16:22.123 "base_bdevs_list": [ 00:16:22.123 { 00:16:22.123 "name": null, 00:16:22.123 "uuid": "5b0e8fd0-27fc-4eba-8840-dab32e34c684", 00:16:22.123 "is_configured": false, 00:16:22.123 "data_offset": 0, 00:16:22.123 "data_size": 65536 00:16:22.123 }, 00:16:22.123 { 00:16:22.123 "name": null, 00:16:22.123 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:22.123 "is_configured": false, 00:16:22.123 "data_offset": 0, 00:16:22.123 "data_size": 65536 00:16:22.123 }, 00:16:22.123 { 00:16:22.123 "name": "BaseBdev3", 00:16:22.123 "uuid": "f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:22.123 "is_configured": true, 00:16:22.123 "data_offset": 0, 00:16:22.123 "data_size": 65536 00:16:22.123 } 00:16:22.123 ] 00:16:22.123 }' 00:16:22.123 13:25:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:22.123 13:25:02 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:16:23.059 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.059 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:23.059 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:23.059 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:23.318 [2024-07-25 13:25:03.893571] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.318 13:25:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.318 13:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.318 "name": "Existed_Raid", 00:16:23.318 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.318 "strip_size_kb": 0, 00:16:23.318 "state": "configuring", 00:16:23.318 "raid_level": "raid1", 00:16:23.318 "superblock": false, 00:16:23.318 "num_base_bdevs": 3, 00:16:23.318 "num_base_bdevs_discovered": 2, 00:16:23.318 "num_base_bdevs_operational": 3, 00:16:23.318 "base_bdevs_list": [ 00:16:23.318 { 00:16:23.318 "name": null, 00:16:23.318 "uuid": "5b0e8fd0-27fc-4eba-8840-dab32e34c684", 00:16:23.318 "is_configured": false, 00:16:23.318 "data_offset": 0, 00:16:23.318 "data_size": 65536 00:16:23.318 }, 00:16:23.318 { 00:16:23.318 "name": "BaseBdev2", 00:16:23.318 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:23.318 "is_configured": true, 00:16:23.318 "data_offset": 0, 00:16:23.318 "data_size": 65536 00:16:23.318 }, 00:16:23.318 { 00:16:23.318 "name": "BaseBdev3", 00:16:23.318 "uuid": "f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:23.318 "is_configured": true, 00:16:23.318 "data_offset": 0, 00:16:23.318 "data_size": 65536 00:16:23.318 } 00:16:23.318 ] 00:16:23.318 }' 00:16:23.318 13:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.318 13:25:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.888 13:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.888 13:25:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:24.146 13:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:24.146 13:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.146 13:25:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:24.405 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 5b0e8fd0-27fc-4eba-8840-dab32e34c684 00:16:24.663 [2024-07-25 13:25:05.225931] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:24.663 [2024-07-25 13:25:05.225958] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fb81d0 00:16:24.663 [2024-07-25 13:25:05.225963] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:16:24.663 [2024-07-25 13:25:05.226107] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fbaac0 00:16:24.663 [2024-07-25 13:25:05.226204] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fb81d0 00:16:24.663 [2024-07-25 13:25:05.226210] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1fb81d0 00:16:24.663 [2024-07-25 13:25:05.226327] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:24.663 NewBaseBdev 00:16:24.663 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:24.663 13:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:16:24.663 13:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- 
# local bdev_timeout= 00:16:24.663 13:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:16:24.663 13:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:24.663 13:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:24.663 13:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:24.663 13:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:24.922 [ 00:16:24.922 { 00:16:24.922 "name": "NewBaseBdev", 00:16:24.922 "aliases": [ 00:16:24.922 "5b0e8fd0-27fc-4eba-8840-dab32e34c684" 00:16:24.922 ], 00:16:24.922 "product_name": "Malloc disk", 00:16:24.922 "block_size": 512, 00:16:24.922 "num_blocks": 65536, 00:16:24.922 "uuid": "5b0e8fd0-27fc-4eba-8840-dab32e34c684", 00:16:24.922 "assigned_rate_limits": { 00:16:24.922 "rw_ios_per_sec": 0, 00:16:24.922 "rw_mbytes_per_sec": 0, 00:16:24.922 "r_mbytes_per_sec": 0, 00:16:24.922 "w_mbytes_per_sec": 0 00:16:24.922 }, 00:16:24.922 "claimed": true, 00:16:24.922 "claim_type": "exclusive_write", 00:16:24.922 "zoned": false, 00:16:24.922 "supported_io_types": { 00:16:24.922 "read": true, 00:16:24.922 "write": true, 00:16:24.922 "unmap": true, 00:16:24.922 "flush": true, 00:16:24.922 "reset": true, 00:16:24.922 "nvme_admin": false, 00:16:24.922 "nvme_io": false, 00:16:24.922 "nvme_io_md": false, 00:16:24.922 "write_zeroes": true, 00:16:24.922 "zcopy": true, 00:16:24.922 "get_zone_info": false, 00:16:24.922 "zone_management": false, 00:16:24.922 "zone_append": false, 00:16:24.922 "compare": false, 00:16:24.922 "compare_and_write": false, 00:16:24.922 "abort": true, 00:16:24.922 "seek_hole": false, 
00:16:24.922 "seek_data": false, 00:16:24.922 "copy": true, 00:16:24.922 "nvme_iov_md": false 00:16:24.922 }, 00:16:24.922 "memory_domains": [ 00:16:24.922 { 00:16:24.922 "dma_device_id": "system", 00:16:24.922 "dma_device_type": 1 00:16:24.922 }, 00:16:24.922 { 00:16:24.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.922 "dma_device_type": 2 00:16:24.922 } 00:16:24.922 ], 00:16:24.922 "driver_specific": {} 00:16:24.922 } 00:16:24.922 ] 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.922 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:16:25.183 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:25.183 "name": "Existed_Raid", 00:16:25.183 "uuid": "801531de-94f2-411d-8b44-653452ffc1b9", 00:16:25.183 "strip_size_kb": 0, 00:16:25.183 "state": "online", 00:16:25.183 "raid_level": "raid1", 00:16:25.184 "superblock": false, 00:16:25.184 "num_base_bdevs": 3, 00:16:25.184 "num_base_bdevs_discovered": 3, 00:16:25.184 "num_base_bdevs_operational": 3, 00:16:25.184 "base_bdevs_list": [ 00:16:25.184 { 00:16:25.184 "name": "NewBaseBdev", 00:16:25.184 "uuid": "5b0e8fd0-27fc-4eba-8840-dab32e34c684", 00:16:25.184 "is_configured": true, 00:16:25.184 "data_offset": 0, 00:16:25.184 "data_size": 65536 00:16:25.184 }, 00:16:25.184 { 00:16:25.184 "name": "BaseBdev2", 00:16:25.184 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:25.184 "is_configured": true, 00:16:25.184 "data_offset": 0, 00:16:25.184 "data_size": 65536 00:16:25.184 }, 00:16:25.184 { 00:16:25.184 "name": "BaseBdev3", 00:16:25.184 "uuid": "f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:25.184 "is_configured": true, 00:16:25.184 "data_offset": 0, 00:16:25.184 "data_size": 65536 00:16:25.184 } 00:16:25.184 ] 00:16:25.184 }' 00:16:25.184 13:25:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:25.184 13:25:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.751 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:25.751 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:25.751 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:25.751 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:25.751 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:16:25.751 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:25.751 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:25.751 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:25.751 [2024-07-25 13:25:06.509446] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:25.751 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:25.751 "name": "Existed_Raid", 00:16:25.751 "aliases": [ 00:16:25.751 "801531de-94f2-411d-8b44-653452ffc1b9" 00:16:25.751 ], 00:16:25.751 "product_name": "Raid Volume", 00:16:25.751 "block_size": 512, 00:16:25.751 "num_blocks": 65536, 00:16:25.751 "uuid": "801531de-94f2-411d-8b44-653452ffc1b9", 00:16:25.751 "assigned_rate_limits": { 00:16:25.751 "rw_ios_per_sec": 0, 00:16:25.751 "rw_mbytes_per_sec": 0, 00:16:25.751 "r_mbytes_per_sec": 0, 00:16:25.751 "w_mbytes_per_sec": 0 00:16:25.751 }, 00:16:25.751 "claimed": false, 00:16:25.751 "zoned": false, 00:16:25.751 "supported_io_types": { 00:16:25.751 "read": true, 00:16:25.751 "write": true, 00:16:25.751 "unmap": false, 00:16:25.751 "flush": false, 00:16:25.751 "reset": true, 00:16:25.751 "nvme_admin": false, 00:16:25.751 "nvme_io": false, 00:16:25.751 "nvme_io_md": false, 00:16:25.751 "write_zeroes": true, 00:16:25.751 "zcopy": false, 00:16:25.751 "get_zone_info": false, 00:16:25.751 "zone_management": false, 00:16:25.751 "zone_append": false, 00:16:25.751 "compare": false, 00:16:25.751 "compare_and_write": false, 00:16:25.751 "abort": false, 00:16:25.751 "seek_hole": false, 00:16:25.751 "seek_data": false, 00:16:25.751 "copy": false, 00:16:25.751 "nvme_iov_md": false 00:16:25.751 }, 00:16:25.751 "memory_domains": [ 00:16:25.751 { 00:16:25.751 "dma_device_id": "system", 
00:16:25.751 "dma_device_type": 1 00:16:25.751 }, 00:16:25.751 { 00:16:25.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.751 "dma_device_type": 2 00:16:25.751 }, 00:16:25.751 { 00:16:25.751 "dma_device_id": "system", 00:16:25.751 "dma_device_type": 1 00:16:25.751 }, 00:16:25.751 { 00:16:25.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.751 "dma_device_type": 2 00:16:25.751 }, 00:16:25.751 { 00:16:25.751 "dma_device_id": "system", 00:16:25.751 "dma_device_type": 1 00:16:25.751 }, 00:16:25.751 { 00:16:25.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:25.751 "dma_device_type": 2 00:16:25.751 } 00:16:25.751 ], 00:16:25.751 "driver_specific": { 00:16:25.751 "raid": { 00:16:25.751 "uuid": "801531de-94f2-411d-8b44-653452ffc1b9", 00:16:25.751 "strip_size_kb": 0, 00:16:25.751 "state": "online", 00:16:25.751 "raid_level": "raid1", 00:16:25.751 "superblock": false, 00:16:25.751 "num_base_bdevs": 3, 00:16:25.751 "num_base_bdevs_discovered": 3, 00:16:25.751 "num_base_bdevs_operational": 3, 00:16:25.751 "base_bdevs_list": [ 00:16:25.751 { 00:16:25.751 "name": "NewBaseBdev", 00:16:25.751 "uuid": "5b0e8fd0-27fc-4eba-8840-dab32e34c684", 00:16:25.751 "is_configured": true, 00:16:25.751 "data_offset": 0, 00:16:25.751 "data_size": 65536 00:16:25.751 }, 00:16:25.751 { 00:16:25.751 "name": "BaseBdev2", 00:16:25.751 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:25.751 "is_configured": true, 00:16:25.751 "data_offset": 0, 00:16:25.751 "data_size": 65536 00:16:25.751 }, 00:16:25.751 { 00:16:25.751 "name": "BaseBdev3", 00:16:25.751 "uuid": "f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:25.751 "is_configured": true, 00:16:25.751 "data_offset": 0, 00:16:25.751 "data_size": 65536 00:16:25.751 } 00:16:25.751 ] 00:16:25.751 } 00:16:25.751 } 00:16:25.751 }' 00:16:25.751 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:26.012 13:25:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:26.012 BaseBdev2 00:16:26.012 BaseBdev3' 00:16:26.012 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:26.012 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:26.012 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:26.012 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:26.012 "name": "NewBaseBdev", 00:16:26.012 "aliases": [ 00:16:26.012 "5b0e8fd0-27fc-4eba-8840-dab32e34c684" 00:16:26.012 ], 00:16:26.012 "product_name": "Malloc disk", 00:16:26.012 "block_size": 512, 00:16:26.012 "num_blocks": 65536, 00:16:26.012 "uuid": "5b0e8fd0-27fc-4eba-8840-dab32e34c684", 00:16:26.012 "assigned_rate_limits": { 00:16:26.012 "rw_ios_per_sec": 0, 00:16:26.012 "rw_mbytes_per_sec": 0, 00:16:26.012 "r_mbytes_per_sec": 0, 00:16:26.012 "w_mbytes_per_sec": 0 00:16:26.012 }, 00:16:26.012 "claimed": true, 00:16:26.012 "claim_type": "exclusive_write", 00:16:26.012 "zoned": false, 00:16:26.012 "supported_io_types": { 00:16:26.012 "read": true, 00:16:26.012 "write": true, 00:16:26.012 "unmap": true, 00:16:26.012 "flush": true, 00:16:26.012 "reset": true, 00:16:26.012 "nvme_admin": false, 00:16:26.012 "nvme_io": false, 00:16:26.012 "nvme_io_md": false, 00:16:26.012 "write_zeroes": true, 00:16:26.012 "zcopy": true, 00:16:26.012 "get_zone_info": false, 00:16:26.012 "zone_management": false, 00:16:26.012 "zone_append": false, 00:16:26.012 "compare": false, 00:16:26.012 "compare_and_write": false, 00:16:26.012 "abort": true, 00:16:26.012 "seek_hole": false, 00:16:26.012 "seek_data": false, 00:16:26.012 "copy": true, 00:16:26.012 "nvme_iov_md": false 00:16:26.012 }, 00:16:26.012 "memory_domains": [ 
00:16:26.012 { 00:16:26.012 "dma_device_id": "system", 00:16:26.012 "dma_device_type": 1 00:16:26.012 }, 00:16:26.012 { 00:16:26.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.012 "dma_device_type": 2 00:16:26.012 } 00:16:26.012 ], 00:16:26.012 "driver_specific": {} 00:16:26.012 }' 00:16:26.012 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.271 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.271 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:26.271 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.271 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.271 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:26.271 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.271 13:25:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.271 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.271 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.530 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:26.530 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:26.530 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:26.530 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:26.530 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:26.530 13:25:07 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:26.530 "name": "BaseBdev2", 00:16:26.530 "aliases": [ 00:16:26.530 "26d669d7-c904-4b19-9a7b-cfc2d6d57992" 00:16:26.530 ], 00:16:26.530 "product_name": "Malloc disk", 00:16:26.530 "block_size": 512, 00:16:26.530 "num_blocks": 65536, 00:16:26.530 "uuid": "26d669d7-c904-4b19-9a7b-cfc2d6d57992", 00:16:26.530 "assigned_rate_limits": { 00:16:26.530 "rw_ios_per_sec": 0, 00:16:26.530 "rw_mbytes_per_sec": 0, 00:16:26.530 "r_mbytes_per_sec": 0, 00:16:26.530 "w_mbytes_per_sec": 0 00:16:26.530 }, 00:16:26.530 "claimed": true, 00:16:26.530 "claim_type": "exclusive_write", 00:16:26.530 "zoned": false, 00:16:26.530 "supported_io_types": { 00:16:26.530 "read": true, 00:16:26.530 "write": true, 00:16:26.530 "unmap": true, 00:16:26.530 "flush": true, 00:16:26.530 "reset": true, 00:16:26.530 "nvme_admin": false, 00:16:26.530 "nvme_io": false, 00:16:26.530 "nvme_io_md": false, 00:16:26.530 "write_zeroes": true, 00:16:26.530 "zcopy": true, 00:16:26.530 "get_zone_info": false, 00:16:26.530 "zone_management": false, 00:16:26.530 "zone_append": false, 00:16:26.530 "compare": false, 00:16:26.530 "compare_and_write": false, 00:16:26.530 "abort": true, 00:16:26.530 "seek_hole": false, 00:16:26.530 "seek_data": false, 00:16:26.530 "copy": true, 00:16:26.530 "nvme_iov_md": false 00:16:26.530 }, 00:16:26.530 "memory_domains": [ 00:16:26.530 { 00:16:26.530 "dma_device_id": "system", 00:16:26.530 "dma_device_type": 1 00:16:26.530 }, 00:16:26.530 { 00:16:26.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:26.530 "dma_device_type": 2 00:16:26.530 } 00:16:26.530 ], 00:16:26.530 "driver_specific": {} 00:16:26.530 }' 00:16:26.530 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.788 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:26.788 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:26.788 13:25:07 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.788 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:26.788 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:26.788 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.788 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:26.788 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:26.788 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.047 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.047 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:27.047 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:27.047 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:27.047 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:27.305 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:27.305 "name": "BaseBdev3", 00:16:27.305 "aliases": [ 00:16:27.305 "f56533a3-66b3-46e7-bff8-ad258d271761" 00:16:27.305 ], 00:16:27.305 "product_name": "Malloc disk", 00:16:27.305 "block_size": 512, 00:16:27.305 "num_blocks": 65536, 00:16:27.305 "uuid": "f56533a3-66b3-46e7-bff8-ad258d271761", 00:16:27.305 "assigned_rate_limits": { 00:16:27.305 "rw_ios_per_sec": 0, 00:16:27.305 "rw_mbytes_per_sec": 0, 00:16:27.305 "r_mbytes_per_sec": 0, 00:16:27.305 "w_mbytes_per_sec": 0 00:16:27.305 }, 00:16:27.305 "claimed": true, 00:16:27.305 "claim_type": "exclusive_write", 
00:16:27.305 "zoned": false, 00:16:27.305 "supported_io_types": { 00:16:27.305 "read": true, 00:16:27.305 "write": true, 00:16:27.305 "unmap": true, 00:16:27.305 "flush": true, 00:16:27.305 "reset": true, 00:16:27.305 "nvme_admin": false, 00:16:27.305 "nvme_io": false, 00:16:27.305 "nvme_io_md": false, 00:16:27.305 "write_zeroes": true, 00:16:27.305 "zcopy": true, 00:16:27.305 "get_zone_info": false, 00:16:27.305 "zone_management": false, 00:16:27.305 "zone_append": false, 00:16:27.305 "compare": false, 00:16:27.305 "compare_and_write": false, 00:16:27.305 "abort": true, 00:16:27.305 "seek_hole": false, 00:16:27.305 "seek_data": false, 00:16:27.305 "copy": true, 00:16:27.305 "nvme_iov_md": false 00:16:27.305 }, 00:16:27.305 "memory_domains": [ 00:16:27.305 { 00:16:27.305 "dma_device_id": "system", 00:16:27.305 "dma_device_type": 1 00:16:27.305 }, 00:16:27.305 { 00:16:27.305 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.305 "dma_device_type": 2 00:16:27.305 } 00:16:27.305 ], 00:16:27.305 "driver_specific": {} 00:16:27.305 }' 00:16:27.305 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.305 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:27.305 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:27.305 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.305 13:25:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:27.305 13:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:27.305 13:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.305 13:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:27.305 13:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:27.305 13:25:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.564 13:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:27.564 13:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:27.564 13:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:27.564 [2024-07-25 13:25:08.341853] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:27.564 [2024-07-25 13:25:08.341869] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:27.564 [2024-07-25 13:25:08.341903] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:27.564 [2024-07-25 13:25:08.342105] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:27.564 [2024-07-25 13:25:08.342112] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fb81d0 name Existed_Raid, state offline 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 925424 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 925424 ']' 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 925424 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 925424 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:16:27.825 13:25:08 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 925424' 00:16:27.825 killing process with pid 925424 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 925424 00:16:27.825 [2024-07-25 13:25:08.414357] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 925424 00:16:27.825 [2024-07-25 13:25:08.429093] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:27.825 00:16:27.825 real 0m30.196s 00:16:27.825 user 0m57.030s 00:16:27.825 sys 0m3.988s 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:27.825 13:25:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.825 ************************************ 00:16:27.825 END TEST raid_state_function_test 00:16:27.825 ************************************ 00:16:27.825 13:25:08 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:16:27.825 13:25:08 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:16:27.825 13:25:08 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:27.825 13:25:08 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:28.085 ************************************ 00:16:28.085 START TEST raid_state_function_test_sb 00:16:28.085 ************************************ 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 3 true 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:28.085 13:25:08 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # local strip_size 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=930932 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 930932' 00:16:28.086 Process raid pid: 930932 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 930932 /var/tmp/spdk-raid.sock 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 930932 ']' 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:16:28.086 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:28.086 13:25:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:28.086 [2024-07-25 13:25:08.683165] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:16:28.086 [2024-07-25 13:25:08.683215] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:28.086 [2024-07-25 13:25:08.772194] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:28.086 [2024-07-25 13:25:08.838725] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:28.346 [2024-07-25 13:25:08.887909] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:28.346 [2024-07-25 13:25:08.887930] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:28.915 13:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:28.915 13:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:16:28.915 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:28.915 [2024-07-25 13:25:09.699591] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:28.915 [2024-07-25 13:25:09.699620] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:28.915 [2024-07-25 13:25:09.699626] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to 
find bdev with name: BaseBdev2 00:16:28.915 [2024-07-25 13:25:09.699632] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:28.915 [2024-07-25 13:25:09.699637] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:28.915 [2024-07-25 13:25:09.699642] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.175 13:25:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.175 "name": "Existed_Raid", 00:16:29.175 "uuid": "71e1b038-23f4-48cd-adf8-a302ee970823", 00:16:29.175 "strip_size_kb": 0, 00:16:29.175 "state": "configuring", 00:16:29.175 "raid_level": "raid1", 00:16:29.175 "superblock": true, 00:16:29.175 "num_base_bdevs": 3, 00:16:29.175 "num_base_bdevs_discovered": 0, 00:16:29.175 "num_base_bdevs_operational": 3, 00:16:29.175 "base_bdevs_list": [ 00:16:29.175 { 00:16:29.175 "name": "BaseBdev1", 00:16:29.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.175 "is_configured": false, 00:16:29.175 "data_offset": 0, 00:16:29.175 "data_size": 0 00:16:29.175 }, 00:16:29.175 { 00:16:29.175 "name": "BaseBdev2", 00:16:29.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.175 "is_configured": false, 00:16:29.175 "data_offset": 0, 00:16:29.175 "data_size": 0 00:16:29.175 }, 00:16:29.175 { 00:16:29.175 "name": "BaseBdev3", 00:16:29.175 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.175 "is_configured": false, 00:16:29.175 "data_offset": 0, 00:16:29.175 "data_size": 0 00:16:29.175 } 00:16:29.175 ] 00:16:29.175 }' 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.175 13:25:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:29.745 13:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:30.004 [2024-07-25 13:25:10.617800] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:30.004 [2024-07-25 13:25:10.617820] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24266d0 name Existed_Raid, state configuring 00:16:30.004 13:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:30.265 [2024-07-25 13:25:10.806301] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:30.265 [2024-07-25 13:25:10.806324] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:30.265 [2024-07-25 13:25:10.806329] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:30.265 [2024-07-25 13:25:10.806335] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:30.265 [2024-07-25 13:25:10.806339] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:30.265 [2024-07-25 13:25:10.806344] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:30.265 13:25:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:30.265 [2024-07-25 13:25:11.001381] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:30.265 BaseBdev1 00:16:30.265 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:30.265 13:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:30.265 13:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:30.265 13:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:30.265 13:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:30.265 13:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:16:30.265 13:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.525 13:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:30.785 [ 00:16:30.785 { 00:16:30.785 "name": "BaseBdev1", 00:16:30.785 "aliases": [ 00:16:30.785 "3b632447-836f-4b1e-9929-cc3fdf9d30ba" 00:16:30.785 ], 00:16:30.785 "product_name": "Malloc disk", 00:16:30.785 "block_size": 512, 00:16:30.785 "num_blocks": 65536, 00:16:30.785 "uuid": "3b632447-836f-4b1e-9929-cc3fdf9d30ba", 00:16:30.785 "assigned_rate_limits": { 00:16:30.785 "rw_ios_per_sec": 0, 00:16:30.785 "rw_mbytes_per_sec": 0, 00:16:30.785 "r_mbytes_per_sec": 0, 00:16:30.785 "w_mbytes_per_sec": 0 00:16:30.785 }, 00:16:30.785 "claimed": true, 00:16:30.785 "claim_type": "exclusive_write", 00:16:30.785 "zoned": false, 00:16:30.785 "supported_io_types": { 00:16:30.785 "read": true, 00:16:30.785 "write": true, 00:16:30.785 "unmap": true, 00:16:30.785 "flush": true, 00:16:30.785 "reset": true, 00:16:30.785 "nvme_admin": false, 00:16:30.785 "nvme_io": false, 00:16:30.785 "nvme_io_md": false, 00:16:30.785 "write_zeroes": true, 00:16:30.785 "zcopy": true, 00:16:30.785 "get_zone_info": false, 00:16:30.785 "zone_management": false, 00:16:30.785 "zone_append": false, 00:16:30.785 "compare": false, 00:16:30.785 "compare_and_write": false, 00:16:30.785 "abort": true, 00:16:30.785 "seek_hole": false, 00:16:30.785 "seek_data": false, 00:16:30.785 "copy": true, 00:16:30.785 "nvme_iov_md": false 00:16:30.785 }, 00:16:30.785 "memory_domains": [ 00:16:30.785 { 00:16:30.785 "dma_device_id": "system", 00:16:30.785 "dma_device_type": 1 00:16:30.785 }, 00:16:30.785 { 00:16:30.785 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.785 
"dma_device_type": 2 00:16:30.785 } 00:16:30.785 ], 00:16:30.785 "driver_specific": {} 00:16:30.785 } 00:16:30.785 ] 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.785 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:31.045 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:31.045 "name": "Existed_Raid", 00:16:31.045 "uuid": "f0ef2d34-136d-46fc-a0fb-3e88159f1f7c", 00:16:31.045 "strip_size_kb": 0, 
00:16:31.045 "state": "configuring", 00:16:31.045 "raid_level": "raid1", 00:16:31.045 "superblock": true, 00:16:31.045 "num_base_bdevs": 3, 00:16:31.045 "num_base_bdevs_discovered": 1, 00:16:31.045 "num_base_bdevs_operational": 3, 00:16:31.045 "base_bdevs_list": [ 00:16:31.045 { 00:16:31.045 "name": "BaseBdev1", 00:16:31.045 "uuid": "3b632447-836f-4b1e-9929-cc3fdf9d30ba", 00:16:31.045 "is_configured": true, 00:16:31.045 "data_offset": 2048, 00:16:31.045 "data_size": 63488 00:16:31.045 }, 00:16:31.045 { 00:16:31.045 "name": "BaseBdev2", 00:16:31.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.045 "is_configured": false, 00:16:31.045 "data_offset": 0, 00:16:31.045 "data_size": 0 00:16:31.045 }, 00:16:31.045 { 00:16:31.045 "name": "BaseBdev3", 00:16:31.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:31.045 "is_configured": false, 00:16:31.045 "data_offset": 0, 00:16:31.045 "data_size": 0 00:16:31.045 } 00:16:31.045 ] 00:16:31.045 }' 00:16:31.045 13:25:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:31.045 13:25:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:31.614 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:31.614 [2024-07-25 13:25:12.304686] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:31.614 [2024-07-25 13:25:12.304712] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2425fa0 name Existed_Raid, state configuring 00:16:31.614 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:31.874 [2024-07-25 13:25:12.501213] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:31.874 [2024-07-25 13:25:12.502318] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:31.874 [2024-07-25 13:25:12.502342] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:31.874 [2024-07-25 13:25:12.502348] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:31.874 [2024-07-25 13:25:12.502353] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:31.874 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:32.135 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:32.135 "name": "Existed_Raid", 00:16:32.135 "uuid": "2d860238-4c8a-4dde-a2c7-663b7a3bb7eb", 00:16:32.135 "strip_size_kb": 0, 00:16:32.135 "state": "configuring", 00:16:32.135 "raid_level": "raid1", 00:16:32.135 "superblock": true, 00:16:32.135 "num_base_bdevs": 3, 00:16:32.135 "num_base_bdevs_discovered": 1, 00:16:32.135 "num_base_bdevs_operational": 3, 00:16:32.135 "base_bdevs_list": [ 00:16:32.135 { 00:16:32.135 "name": "BaseBdev1", 00:16:32.135 "uuid": "3b632447-836f-4b1e-9929-cc3fdf9d30ba", 00:16:32.135 "is_configured": true, 00:16:32.135 "data_offset": 2048, 00:16:32.135 "data_size": 63488 00:16:32.135 }, 00:16:32.135 { 00:16:32.135 "name": "BaseBdev2", 00:16:32.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.135 "is_configured": false, 00:16:32.135 "data_offset": 0, 00:16:32.135 "data_size": 0 00:16:32.135 }, 00:16:32.135 { 00:16:32.135 "name": "BaseBdev3", 00:16:32.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:32.135 "is_configured": false, 00:16:32.135 "data_offset": 0, 00:16:32.135 "data_size": 0 00:16:32.135 } 00:16:32.135 ] 00:16:32.135 }' 00:16:32.135 13:25:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:32.135 13:25:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:32.705 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:32.705 
[2024-07-25 13:25:13.444600] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:32.705 BaseBdev2 00:16:32.705 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:32.705 13:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:32.705 13:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:32.705 13:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:32.705 13:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:32.705 13:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:32.705 13:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:32.965 13:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:33.225 [ 00:16:33.225 { 00:16:33.225 "name": "BaseBdev2", 00:16:33.225 "aliases": [ 00:16:33.225 "f97ae34e-3385-4b31-88ea-539786ae3a94" 00:16:33.225 ], 00:16:33.225 "product_name": "Malloc disk", 00:16:33.225 "block_size": 512, 00:16:33.225 "num_blocks": 65536, 00:16:33.225 "uuid": "f97ae34e-3385-4b31-88ea-539786ae3a94", 00:16:33.225 "assigned_rate_limits": { 00:16:33.225 "rw_ios_per_sec": 0, 00:16:33.225 "rw_mbytes_per_sec": 0, 00:16:33.225 "r_mbytes_per_sec": 0, 00:16:33.225 "w_mbytes_per_sec": 0 00:16:33.225 }, 00:16:33.225 "claimed": true, 00:16:33.225 "claim_type": "exclusive_write", 00:16:33.225 "zoned": false, 00:16:33.225 "supported_io_types": { 00:16:33.225 "read": true, 00:16:33.225 "write": true, 00:16:33.225 "unmap": 
true, 00:16:33.225 "flush": true, 00:16:33.225 "reset": true, 00:16:33.225 "nvme_admin": false, 00:16:33.225 "nvme_io": false, 00:16:33.225 "nvme_io_md": false, 00:16:33.225 "write_zeroes": true, 00:16:33.225 "zcopy": true, 00:16:33.225 "get_zone_info": false, 00:16:33.225 "zone_management": false, 00:16:33.225 "zone_append": false, 00:16:33.225 "compare": false, 00:16:33.225 "compare_and_write": false, 00:16:33.225 "abort": true, 00:16:33.225 "seek_hole": false, 00:16:33.225 "seek_data": false, 00:16:33.225 "copy": true, 00:16:33.225 "nvme_iov_md": false 00:16:33.225 }, 00:16:33.225 "memory_domains": [ 00:16:33.225 { 00:16:33.225 "dma_device_id": "system", 00:16:33.225 "dma_device_type": 1 00:16:33.225 }, 00:16:33.225 { 00:16:33.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.225 "dma_device_type": 2 00:16:33.225 } 00:16:33.225 ], 00:16:33.225 "driver_specific": {} 00:16:33.225 } 00:16:33.225 ] 00:16:33.225 13:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:33.225 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:33.225 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:33.226 
13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:33.226 "name": "Existed_Raid", 00:16:33.226 "uuid": "2d860238-4c8a-4dde-a2c7-663b7a3bb7eb", 00:16:33.226 "strip_size_kb": 0, 00:16:33.226 "state": "configuring", 00:16:33.226 "raid_level": "raid1", 00:16:33.226 "superblock": true, 00:16:33.226 "num_base_bdevs": 3, 00:16:33.226 "num_base_bdevs_discovered": 2, 00:16:33.226 "num_base_bdevs_operational": 3, 00:16:33.226 "base_bdevs_list": [ 00:16:33.226 { 00:16:33.226 "name": "BaseBdev1", 00:16:33.226 "uuid": "3b632447-836f-4b1e-9929-cc3fdf9d30ba", 00:16:33.226 "is_configured": true, 00:16:33.226 "data_offset": 2048, 00:16:33.226 "data_size": 63488 00:16:33.226 }, 00:16:33.226 { 00:16:33.226 "name": "BaseBdev2", 00:16:33.226 "uuid": "f97ae34e-3385-4b31-88ea-539786ae3a94", 00:16:33.226 "is_configured": true, 00:16:33.226 "data_offset": 2048, 00:16:33.226 "data_size": 63488 00:16:33.226 }, 00:16:33.226 { 00:16:33.226 "name": "BaseBdev3", 00:16:33.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:33.226 "is_configured": false, 00:16:33.226 "data_offset": 0, 00:16:33.226 "data_size": 0 00:16:33.226 } 00:16:33.226 ] 00:16:33.226 }' 00:16:33.226 
13:25:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:33.226 13:25:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:33.795 13:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:34.054 [2024-07-25 13:25:14.704449] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:34.054 [2024-07-25 13:25:14.704576] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2426ea0 00:16:34.054 [2024-07-25 13:25:14.704585] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:34.054 [2024-07-25 13:25:14.704720] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2426b70 00:16:34.054 [2024-07-25 13:25:14.704814] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2426ea0 00:16:34.054 [2024-07-25 13:25:14.704819] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2426ea0 00:16:34.054 [2024-07-25 13:25:14.704888] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:34.054 BaseBdev3 00:16:34.054 13:25:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:34.054 13:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:34.054 13:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:34.054 13:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:34.054 13:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:34.054 13:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:16:34.054 13:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:34.318 13:25:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:34.606 [ 00:16:34.606 { 00:16:34.606 "name": "BaseBdev3", 00:16:34.606 "aliases": [ 00:16:34.606 "889f9aa1-ec09-4739-b794-21a6e934fe24" 00:16:34.606 ], 00:16:34.606 "product_name": "Malloc disk", 00:16:34.606 "block_size": 512, 00:16:34.606 "num_blocks": 65536, 00:16:34.606 "uuid": "889f9aa1-ec09-4739-b794-21a6e934fe24", 00:16:34.606 "assigned_rate_limits": { 00:16:34.606 "rw_ios_per_sec": 0, 00:16:34.606 "rw_mbytes_per_sec": 0, 00:16:34.606 "r_mbytes_per_sec": 0, 00:16:34.606 "w_mbytes_per_sec": 0 00:16:34.606 }, 00:16:34.606 "claimed": true, 00:16:34.606 "claim_type": "exclusive_write", 00:16:34.606 "zoned": false, 00:16:34.606 "supported_io_types": { 00:16:34.606 "read": true, 00:16:34.606 "write": true, 00:16:34.606 "unmap": true, 00:16:34.606 "flush": true, 00:16:34.606 "reset": true, 00:16:34.606 "nvme_admin": false, 00:16:34.606 "nvme_io": false, 00:16:34.606 "nvme_io_md": false, 00:16:34.606 "write_zeroes": true, 00:16:34.606 "zcopy": true, 00:16:34.606 "get_zone_info": false, 00:16:34.607 "zone_management": false, 00:16:34.607 "zone_append": false, 00:16:34.607 "compare": false, 00:16:34.607 "compare_and_write": false, 00:16:34.607 "abort": true, 00:16:34.607 "seek_hole": false, 00:16:34.607 "seek_data": false, 00:16:34.607 "copy": true, 00:16:34.607 "nvme_iov_md": false 00:16:34.607 }, 00:16:34.607 "memory_domains": [ 00:16:34.607 { 00:16:34.607 "dma_device_id": "system", 00:16:34.607 "dma_device_type": 1 00:16:34.607 }, 00:16:34.607 { 00:16:34.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:34.607 
"dma_device_type": 2 00:16:34.607 } 00:16:34.607 ], 00:16:34.607 "driver_specific": {} 00:16:34.607 } 00:16:34.607 ] 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:34.607 13:25:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.607 "name": "Existed_Raid", 00:16:34.607 "uuid": "2d860238-4c8a-4dde-a2c7-663b7a3bb7eb", 00:16:34.607 "strip_size_kb": 0, 00:16:34.607 "state": "online", 00:16:34.607 "raid_level": "raid1", 00:16:34.607 "superblock": true, 00:16:34.607 "num_base_bdevs": 3, 00:16:34.607 "num_base_bdevs_discovered": 3, 00:16:34.607 "num_base_bdevs_operational": 3, 00:16:34.607 "base_bdevs_list": [ 00:16:34.607 { 00:16:34.607 "name": "BaseBdev1", 00:16:34.607 "uuid": "3b632447-836f-4b1e-9929-cc3fdf9d30ba", 00:16:34.607 "is_configured": true, 00:16:34.607 "data_offset": 2048, 00:16:34.607 "data_size": 63488 00:16:34.607 }, 00:16:34.607 { 00:16:34.607 "name": "BaseBdev2", 00:16:34.607 "uuid": "f97ae34e-3385-4b31-88ea-539786ae3a94", 00:16:34.607 "is_configured": true, 00:16:34.607 "data_offset": 2048, 00:16:34.607 "data_size": 63488 00:16:34.607 }, 00:16:34.607 { 00:16:34.607 "name": "BaseBdev3", 00:16:34.607 "uuid": "889f9aa1-ec09-4739-b794-21a6e934fe24", 00:16:34.607 "is_configured": true, 00:16:34.607 "data_offset": 2048, 00:16:34.607 "data_size": 63488 00:16:34.607 } 00:16:34.607 ] 00:16:34.607 }' 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.607 13:25:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:35.189 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:35.189 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:35.189 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:35.189 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:35.189 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:16:35.189 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:35.189 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:35.189 13:25:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:35.449 [2024-07-25 13:25:16.052102] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:35.449 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:35.449 "name": "Existed_Raid", 00:16:35.449 "aliases": [ 00:16:35.449 "2d860238-4c8a-4dde-a2c7-663b7a3bb7eb" 00:16:35.449 ], 00:16:35.449 "product_name": "Raid Volume", 00:16:35.449 "block_size": 512, 00:16:35.449 "num_blocks": 63488, 00:16:35.449 "uuid": "2d860238-4c8a-4dde-a2c7-663b7a3bb7eb", 00:16:35.449 "assigned_rate_limits": { 00:16:35.450 "rw_ios_per_sec": 0, 00:16:35.450 "rw_mbytes_per_sec": 0, 00:16:35.450 "r_mbytes_per_sec": 0, 00:16:35.450 "w_mbytes_per_sec": 0 00:16:35.450 }, 00:16:35.450 "claimed": false, 00:16:35.450 "zoned": false, 00:16:35.450 "supported_io_types": { 00:16:35.450 "read": true, 00:16:35.450 "write": true, 00:16:35.450 "unmap": false, 00:16:35.450 "flush": false, 00:16:35.450 "reset": true, 00:16:35.450 "nvme_admin": false, 00:16:35.450 "nvme_io": false, 00:16:35.450 "nvme_io_md": false, 00:16:35.450 "write_zeroes": true, 00:16:35.450 "zcopy": false, 00:16:35.450 "get_zone_info": false, 00:16:35.450 "zone_management": false, 00:16:35.450 "zone_append": false, 00:16:35.450 "compare": false, 00:16:35.450 "compare_and_write": false, 00:16:35.450 "abort": false, 00:16:35.450 "seek_hole": false, 00:16:35.450 "seek_data": false, 00:16:35.450 "copy": false, 00:16:35.450 "nvme_iov_md": false 00:16:35.450 }, 00:16:35.450 "memory_domains": [ 00:16:35.450 { 00:16:35.450 "dma_device_id": "system", 00:16:35.450 
"dma_device_type": 1 00:16:35.450 }, 00:16:35.450 { 00:16:35.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.450 "dma_device_type": 2 00:16:35.450 }, 00:16:35.450 { 00:16:35.450 "dma_device_id": "system", 00:16:35.450 "dma_device_type": 1 00:16:35.450 }, 00:16:35.450 { 00:16:35.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.450 "dma_device_type": 2 00:16:35.450 }, 00:16:35.450 { 00:16:35.450 "dma_device_id": "system", 00:16:35.450 "dma_device_type": 1 00:16:35.450 }, 00:16:35.450 { 00:16:35.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.450 "dma_device_type": 2 00:16:35.450 } 00:16:35.450 ], 00:16:35.450 "driver_specific": { 00:16:35.450 "raid": { 00:16:35.450 "uuid": "2d860238-4c8a-4dde-a2c7-663b7a3bb7eb", 00:16:35.450 "strip_size_kb": 0, 00:16:35.450 "state": "online", 00:16:35.450 "raid_level": "raid1", 00:16:35.450 "superblock": true, 00:16:35.450 "num_base_bdevs": 3, 00:16:35.450 "num_base_bdevs_discovered": 3, 00:16:35.450 "num_base_bdevs_operational": 3, 00:16:35.450 "base_bdevs_list": [ 00:16:35.450 { 00:16:35.450 "name": "BaseBdev1", 00:16:35.450 "uuid": "3b632447-836f-4b1e-9929-cc3fdf9d30ba", 00:16:35.450 "is_configured": true, 00:16:35.450 "data_offset": 2048, 00:16:35.450 "data_size": 63488 00:16:35.450 }, 00:16:35.450 { 00:16:35.450 "name": "BaseBdev2", 00:16:35.450 "uuid": "f97ae34e-3385-4b31-88ea-539786ae3a94", 00:16:35.450 "is_configured": true, 00:16:35.450 "data_offset": 2048, 00:16:35.450 "data_size": 63488 00:16:35.450 }, 00:16:35.450 { 00:16:35.450 "name": "BaseBdev3", 00:16:35.450 "uuid": "889f9aa1-ec09-4739-b794-21a6e934fe24", 00:16:35.450 "is_configured": true, 00:16:35.450 "data_offset": 2048, 00:16:35.450 "data_size": 63488 00:16:35.450 } 00:16:35.450 ] 00:16:35.450 } 00:16:35.450 } 00:16:35.450 }' 00:16:35.450 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:35.450 13:25:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:35.450 BaseBdev2 00:16:35.450 BaseBdev3' 00:16:35.450 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:35.450 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:35.450 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:35.711 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:35.711 "name": "BaseBdev1", 00:16:35.711 "aliases": [ 00:16:35.711 "3b632447-836f-4b1e-9929-cc3fdf9d30ba" 00:16:35.711 ], 00:16:35.711 "product_name": "Malloc disk", 00:16:35.711 "block_size": 512, 00:16:35.711 "num_blocks": 65536, 00:16:35.711 "uuid": "3b632447-836f-4b1e-9929-cc3fdf9d30ba", 00:16:35.711 "assigned_rate_limits": { 00:16:35.711 "rw_ios_per_sec": 0, 00:16:35.711 "rw_mbytes_per_sec": 0, 00:16:35.711 "r_mbytes_per_sec": 0, 00:16:35.711 "w_mbytes_per_sec": 0 00:16:35.711 }, 00:16:35.711 "claimed": true, 00:16:35.711 "claim_type": "exclusive_write", 00:16:35.711 "zoned": false, 00:16:35.711 "supported_io_types": { 00:16:35.711 "read": true, 00:16:35.711 "write": true, 00:16:35.711 "unmap": true, 00:16:35.711 "flush": true, 00:16:35.711 "reset": true, 00:16:35.711 "nvme_admin": false, 00:16:35.711 "nvme_io": false, 00:16:35.711 "nvme_io_md": false, 00:16:35.711 "write_zeroes": true, 00:16:35.711 "zcopy": true, 00:16:35.711 "get_zone_info": false, 00:16:35.711 "zone_management": false, 00:16:35.711 "zone_append": false, 00:16:35.711 "compare": false, 00:16:35.711 "compare_and_write": false, 00:16:35.711 "abort": true, 00:16:35.711 "seek_hole": false, 00:16:35.711 "seek_data": false, 00:16:35.711 "copy": true, 00:16:35.711 "nvme_iov_md": false 00:16:35.711 }, 00:16:35.711 "memory_domains": 
[ 00:16:35.711 { 00:16:35.711 "dma_device_id": "system", 00:16:35.711 "dma_device_type": 1 00:16:35.711 }, 00:16:35.711 { 00:16:35.711 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:35.711 "dma_device_type": 2 00:16:35.711 } 00:16:35.711 ], 00:16:35.711 "driver_specific": {} 00:16:35.711 }' 00:16:35.711 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.711 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:35.711 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:35.711 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.711 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:35.972 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:35.972 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:35.972 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:35.972 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:35.972 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.972 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:35.972 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:35.972 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:35.972 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:35.972 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
00:16:36.232 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:36.232 "name": "BaseBdev2", 00:16:36.232 "aliases": [ 00:16:36.232 "f97ae34e-3385-4b31-88ea-539786ae3a94" 00:16:36.232 ], 00:16:36.232 "product_name": "Malloc disk", 00:16:36.232 "block_size": 512, 00:16:36.232 "num_blocks": 65536, 00:16:36.232 "uuid": "f97ae34e-3385-4b31-88ea-539786ae3a94", 00:16:36.232 "assigned_rate_limits": { 00:16:36.232 "rw_ios_per_sec": 0, 00:16:36.232 "rw_mbytes_per_sec": 0, 00:16:36.232 "r_mbytes_per_sec": 0, 00:16:36.232 "w_mbytes_per_sec": 0 00:16:36.232 }, 00:16:36.232 "claimed": true, 00:16:36.232 "claim_type": "exclusive_write", 00:16:36.232 "zoned": false, 00:16:36.232 "supported_io_types": { 00:16:36.232 "read": true, 00:16:36.232 "write": true, 00:16:36.232 "unmap": true, 00:16:36.232 "flush": true, 00:16:36.232 "reset": true, 00:16:36.232 "nvme_admin": false, 00:16:36.232 "nvme_io": false, 00:16:36.232 "nvme_io_md": false, 00:16:36.232 "write_zeroes": true, 00:16:36.232 "zcopy": true, 00:16:36.232 "get_zone_info": false, 00:16:36.232 "zone_management": false, 00:16:36.232 "zone_append": false, 00:16:36.232 "compare": false, 00:16:36.232 "compare_and_write": false, 00:16:36.232 "abort": true, 00:16:36.232 "seek_hole": false, 00:16:36.232 "seek_data": false, 00:16:36.232 "copy": true, 00:16:36.232 "nvme_iov_md": false 00:16:36.232 }, 00:16:36.232 "memory_domains": [ 00:16:36.232 { 00:16:36.232 "dma_device_id": "system", 00:16:36.232 "dma_device_type": 1 00:16:36.232 }, 00:16:36.232 { 00:16:36.232 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.232 "dma_device_type": 2 00:16:36.232 } 00:16:36.232 ], 00:16:36.232 "driver_specific": {} 00:16:36.232 }' 00:16:36.232 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.232 13:25:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.232 13:25:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:36.232 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:36.492 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:36.751 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:36.751 "name": "BaseBdev3", 00:16:36.751 "aliases": [ 00:16:36.751 "889f9aa1-ec09-4739-b794-21a6e934fe24" 00:16:36.751 ], 00:16:36.751 "product_name": "Malloc disk", 00:16:36.751 "block_size": 512, 00:16:36.751 "num_blocks": 65536, 00:16:36.751 "uuid": "889f9aa1-ec09-4739-b794-21a6e934fe24", 00:16:36.751 "assigned_rate_limits": { 00:16:36.751 "rw_ios_per_sec": 0, 00:16:36.751 "rw_mbytes_per_sec": 0, 00:16:36.751 "r_mbytes_per_sec": 0, 00:16:36.751 
"w_mbytes_per_sec": 0 00:16:36.751 }, 00:16:36.751 "claimed": true, 00:16:36.751 "claim_type": "exclusive_write", 00:16:36.751 "zoned": false, 00:16:36.751 "supported_io_types": { 00:16:36.751 "read": true, 00:16:36.751 "write": true, 00:16:36.751 "unmap": true, 00:16:36.751 "flush": true, 00:16:36.751 "reset": true, 00:16:36.751 "nvme_admin": false, 00:16:36.751 "nvme_io": false, 00:16:36.751 "nvme_io_md": false, 00:16:36.751 "write_zeroes": true, 00:16:36.751 "zcopy": true, 00:16:36.751 "get_zone_info": false, 00:16:36.751 "zone_management": false, 00:16:36.751 "zone_append": false, 00:16:36.751 "compare": false, 00:16:36.751 "compare_and_write": false, 00:16:36.751 "abort": true, 00:16:36.751 "seek_hole": false, 00:16:36.751 "seek_data": false, 00:16:36.751 "copy": true, 00:16:36.751 "nvme_iov_md": false 00:16:36.751 }, 00:16:36.751 "memory_domains": [ 00:16:36.751 { 00:16:36.751 "dma_device_id": "system", 00:16:36.751 "dma_device_type": 1 00:16:36.751 }, 00:16:36.751 { 00:16:36.751 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:36.752 "dma_device_type": 2 00:16:36.752 } 00:16:36.752 ], 00:16:36.752 "driver_specific": {} 00:16:36.752 }' 00:16:36.752 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:36.752 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:37.011 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:37.011 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.011 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:37.011 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:37.012 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:37.012 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:16:37.012 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:37.012 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.012 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:37.271 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:37.272 13:25:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:37.272 [2024-07-25 13:25:17.988805] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:37.272 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:37.532 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:37.532 "name": "Existed_Raid", 00:16:37.532 "uuid": "2d860238-4c8a-4dde-a2c7-663b7a3bb7eb", 00:16:37.532 "strip_size_kb": 0, 00:16:37.532 "state": "online", 00:16:37.532 "raid_level": "raid1", 00:16:37.532 "superblock": true, 00:16:37.532 "num_base_bdevs": 3, 00:16:37.532 "num_base_bdevs_discovered": 2, 00:16:37.532 "num_base_bdevs_operational": 2, 00:16:37.532 "base_bdevs_list": [ 00:16:37.532 { 00:16:37.532 "name": null, 00:16:37.532 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:37.532 "is_configured": false, 00:16:37.532 "data_offset": 2048, 00:16:37.532 "data_size": 63488 00:16:37.532 }, 00:16:37.532 { 00:16:37.532 "name": "BaseBdev2", 00:16:37.532 "uuid": "f97ae34e-3385-4b31-88ea-539786ae3a94", 00:16:37.532 "is_configured": true, 00:16:37.532 "data_offset": 2048, 00:16:37.532 "data_size": 63488 00:16:37.532 }, 00:16:37.532 { 00:16:37.532 "name": "BaseBdev3", 00:16:37.532 "uuid": "889f9aa1-ec09-4739-b794-21a6e934fe24", 00:16:37.532 "is_configured": true, 00:16:37.532 "data_offset": 2048, 00:16:37.532 "data_size": 63488 00:16:37.532 } 
00:16:37.532 ] 00:16:37.532 }' 00:16:37.532 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:37.532 13:25:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:38.102 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:38.102 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:38.102 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.102 13:25:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:38.362 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:38.362 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:38.362 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:38.622 [2024-07-25 13:25:19.203863] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:38.622 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:38.622 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:38.622 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.623 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:38.883 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:38.883 13:25:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:38.883 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:38.883 [2024-07-25 13:25:19.606607] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:38.883 [2024-07-25 13:25:19.606670] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:38.883 [2024-07-25 13:25:19.612623] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:38.883 [2024-07-25 13:25:19.612647] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:38.883 [2024-07-25 13:25:19.612653] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2426ea0 name Existed_Raid, state offline 00:16:38.883 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:38.883 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:38.883 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:38.883 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:39.142 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:39.142 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:39.142 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:16:39.142 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:39.142 13:25:19 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:39.142 13:25:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:39.402 BaseBdev2 00:16:39.402 13:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:39.402 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:16:39.402 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:39.402 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:39.402 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:39.402 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:39.403 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:39.663 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:39.663 [ 00:16:39.663 { 00:16:39.663 "name": "BaseBdev2", 00:16:39.663 "aliases": [ 00:16:39.663 "dd054684-ab4c-4233-89f8-cb89e56fa65c" 00:16:39.663 ], 00:16:39.663 "product_name": "Malloc disk", 00:16:39.663 "block_size": 512, 00:16:39.663 "num_blocks": 65536, 00:16:39.663 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:39.663 "assigned_rate_limits": { 00:16:39.663 "rw_ios_per_sec": 0, 00:16:39.663 "rw_mbytes_per_sec": 0, 00:16:39.663 "r_mbytes_per_sec": 0, 00:16:39.663 "w_mbytes_per_sec": 0 00:16:39.663 }, 00:16:39.663 "claimed": false, 00:16:39.663 "zoned": false, 
00:16:39.663 "supported_io_types": { 00:16:39.663 "read": true, 00:16:39.663 "write": true, 00:16:39.663 "unmap": true, 00:16:39.663 "flush": true, 00:16:39.663 "reset": true, 00:16:39.663 "nvme_admin": false, 00:16:39.663 "nvme_io": false, 00:16:39.663 "nvme_io_md": false, 00:16:39.663 "write_zeroes": true, 00:16:39.663 "zcopy": true, 00:16:39.663 "get_zone_info": false, 00:16:39.663 "zone_management": false, 00:16:39.663 "zone_append": false, 00:16:39.663 "compare": false, 00:16:39.663 "compare_and_write": false, 00:16:39.663 "abort": true, 00:16:39.663 "seek_hole": false, 00:16:39.663 "seek_data": false, 00:16:39.663 "copy": true, 00:16:39.663 "nvme_iov_md": false 00:16:39.663 }, 00:16:39.663 "memory_domains": [ 00:16:39.663 { 00:16:39.663 "dma_device_id": "system", 00:16:39.663 "dma_device_type": 1 00:16:39.663 }, 00:16:39.663 { 00:16:39.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.663 "dma_device_type": 2 00:16:39.663 } 00:16:39.663 ], 00:16:39.663 "driver_specific": {} 00:16:39.663 } 00:16:39.663 ] 00:16:39.663 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:39.663 13:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:39.663 13:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:39.663 13:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:39.923 BaseBdev3 00:16:39.923 13:25:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:39.923 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:16:39.923 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:39.923 13:25:20 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:39.924 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:39.924 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:39.924 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:40.183 13:25:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:40.442 [ 00:16:40.442 { 00:16:40.442 "name": "BaseBdev3", 00:16:40.442 "aliases": [ 00:16:40.443 "c95510f1-8042-44d7-86b7-90b4ab9feaeb" 00:16:40.443 ], 00:16:40.443 "product_name": "Malloc disk", 00:16:40.443 "block_size": 512, 00:16:40.443 "num_blocks": 65536, 00:16:40.443 "uuid": "c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:40.443 "assigned_rate_limits": { 00:16:40.443 "rw_ios_per_sec": 0, 00:16:40.443 "rw_mbytes_per_sec": 0, 00:16:40.443 "r_mbytes_per_sec": 0, 00:16:40.443 "w_mbytes_per_sec": 0 00:16:40.443 }, 00:16:40.443 "claimed": false, 00:16:40.443 "zoned": false, 00:16:40.443 "supported_io_types": { 00:16:40.443 "read": true, 00:16:40.443 "write": true, 00:16:40.443 "unmap": true, 00:16:40.443 "flush": true, 00:16:40.443 "reset": true, 00:16:40.443 "nvme_admin": false, 00:16:40.443 "nvme_io": false, 00:16:40.443 "nvme_io_md": false, 00:16:40.443 "write_zeroes": true, 00:16:40.443 "zcopy": true, 00:16:40.443 "get_zone_info": false, 00:16:40.443 "zone_management": false, 00:16:40.443 "zone_append": false, 00:16:40.443 "compare": false, 00:16:40.443 "compare_and_write": false, 00:16:40.443 "abort": true, 00:16:40.443 "seek_hole": false, 00:16:40.443 "seek_data": false, 00:16:40.443 "copy": true, 00:16:40.443 "nvme_iov_md": 
false 00:16:40.443 }, 00:16:40.443 "memory_domains": [ 00:16:40.443 { 00:16:40.443 "dma_device_id": "system", 00:16:40.443 "dma_device_type": 1 00:16:40.443 }, 00:16:40.443 { 00:16:40.443 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:40.443 "dma_device_type": 2 00:16:40.443 } 00:16:40.443 ], 00:16:40.443 "driver_specific": {} 00:16:40.443 } 00:16:40.443 ] 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:16:40.443 [2024-07-25 13:25:21.182383] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:40.443 [2024-07-25 13:25:21.182413] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:40.443 [2024-07-25 13:25:21.182425] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:40.443 [2024-07-25 13:25:21.183466] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:40.443 13:25:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.443 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.703 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.703 "name": "Existed_Raid", 00:16:40.703 "uuid": "837472aa-00c5-43ba-bc81-10ca7b30598d", 00:16:40.703 "strip_size_kb": 0, 00:16:40.703 "state": "configuring", 00:16:40.703 "raid_level": "raid1", 00:16:40.703 "superblock": true, 00:16:40.703 "num_base_bdevs": 3, 00:16:40.703 "num_base_bdevs_discovered": 2, 00:16:40.703 "num_base_bdevs_operational": 3, 00:16:40.703 "base_bdevs_list": [ 00:16:40.703 { 00:16:40.703 "name": "BaseBdev1", 00:16:40.703 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.703 "is_configured": false, 00:16:40.703 "data_offset": 0, 00:16:40.703 "data_size": 0 00:16:40.703 }, 00:16:40.703 { 00:16:40.703 "name": "BaseBdev2", 00:16:40.703 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:40.703 "is_configured": true, 00:16:40.703 "data_offset": 2048, 00:16:40.703 "data_size": 63488 00:16:40.703 }, 00:16:40.703 { 00:16:40.703 "name": "BaseBdev3", 
00:16:40.703 "uuid": "c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:40.703 "is_configured": true, 00:16:40.703 "data_offset": 2048, 00:16:40.703 "data_size": 63488 00:16:40.703 } 00:16:40.703 ] 00:16:40.703 }' 00:16:40.703 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.703 13:25:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:41.272 13:25:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:41.531 [2024-07-25 13:25:22.092649] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.532 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:41.791 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:41.791 "name": "Existed_Raid", 00:16:41.791 "uuid": "837472aa-00c5-43ba-bc81-10ca7b30598d", 00:16:41.791 "strip_size_kb": 0, 00:16:41.791 "state": "configuring", 00:16:41.791 "raid_level": "raid1", 00:16:41.791 "superblock": true, 00:16:41.791 "num_base_bdevs": 3, 00:16:41.791 "num_base_bdevs_discovered": 1, 00:16:41.791 "num_base_bdevs_operational": 3, 00:16:41.791 "base_bdevs_list": [ 00:16:41.791 { 00:16:41.791 "name": "BaseBdev1", 00:16:41.791 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:41.791 "is_configured": false, 00:16:41.791 "data_offset": 0, 00:16:41.791 "data_size": 0 00:16:41.791 }, 00:16:41.791 { 00:16:41.791 "name": null, 00:16:41.791 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:41.791 "is_configured": false, 00:16:41.791 "data_offset": 2048, 00:16:41.791 "data_size": 63488 00:16:41.791 }, 00:16:41.791 { 00:16:41.791 "name": "BaseBdev3", 00:16:41.791 "uuid": "c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:41.791 "is_configured": true, 00:16:41.791 "data_offset": 2048, 00:16:41.791 "data_size": 63488 00:16:41.791 } 00:16:41.791 ] 00:16:41.791 }' 00:16:41.791 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:41.791 13:25:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:42.051 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.051 13:25:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq 
'.[0].base_bdevs_list[1].is_configured' 00:16:42.310 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:42.310 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:42.570 [2024-07-25 13:25:23.208267] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:42.570 BaseBdev1 00:16:42.570 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:42.570 13:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:16:42.570 13:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:42.570 13:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:42.570 13:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:42.570 13:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:42.570 13:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:42.830 [ 00:16:42.830 { 00:16:42.830 "name": "BaseBdev1", 00:16:42.830 "aliases": [ 00:16:42.830 "a3b3735a-c24d-4ba3-8380-4154a1db2ae7" 00:16:42.830 ], 00:16:42.830 "product_name": "Malloc disk", 00:16:42.830 "block_size": 512, 00:16:42.830 "num_blocks": 65536, 00:16:42.830 "uuid": "a3b3735a-c24d-4ba3-8380-4154a1db2ae7", 00:16:42.830 
"assigned_rate_limits": { 00:16:42.830 "rw_ios_per_sec": 0, 00:16:42.830 "rw_mbytes_per_sec": 0, 00:16:42.830 "r_mbytes_per_sec": 0, 00:16:42.830 "w_mbytes_per_sec": 0 00:16:42.830 }, 00:16:42.830 "claimed": true, 00:16:42.830 "claim_type": "exclusive_write", 00:16:42.830 "zoned": false, 00:16:42.830 "supported_io_types": { 00:16:42.830 "read": true, 00:16:42.830 "write": true, 00:16:42.830 "unmap": true, 00:16:42.830 "flush": true, 00:16:42.830 "reset": true, 00:16:42.830 "nvme_admin": false, 00:16:42.830 "nvme_io": false, 00:16:42.830 "nvme_io_md": false, 00:16:42.830 "write_zeroes": true, 00:16:42.830 "zcopy": true, 00:16:42.830 "get_zone_info": false, 00:16:42.830 "zone_management": false, 00:16:42.830 "zone_append": false, 00:16:42.830 "compare": false, 00:16:42.830 "compare_and_write": false, 00:16:42.830 "abort": true, 00:16:42.830 "seek_hole": false, 00:16:42.830 "seek_data": false, 00:16:42.830 "copy": true, 00:16:42.830 "nvme_iov_md": false 00:16:42.830 }, 00:16:42.830 "memory_domains": [ 00:16:42.830 { 00:16:42.830 "dma_device_id": "system", 00:16:42.830 "dma_device_type": 1 00:16:42.830 }, 00:16:42.830 { 00:16:42.830 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.830 "dma_device_type": 2 00:16:42.830 } 00:16:42.830 ], 00:16:42.830 "driver_specific": {} 00:16:42.830 } 00:16:42.830 ] 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.830 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:43.089 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:43.089 "name": "Existed_Raid", 00:16:43.089 "uuid": "837472aa-00c5-43ba-bc81-10ca7b30598d", 00:16:43.089 "strip_size_kb": 0, 00:16:43.089 "state": "configuring", 00:16:43.089 "raid_level": "raid1", 00:16:43.089 "superblock": true, 00:16:43.089 "num_base_bdevs": 3, 00:16:43.089 "num_base_bdevs_discovered": 2, 00:16:43.089 "num_base_bdevs_operational": 3, 00:16:43.089 "base_bdevs_list": [ 00:16:43.089 { 00:16:43.089 "name": "BaseBdev1", 00:16:43.089 "uuid": "a3b3735a-c24d-4ba3-8380-4154a1db2ae7", 00:16:43.089 "is_configured": true, 00:16:43.089 "data_offset": 2048, 00:16:43.089 "data_size": 63488 00:16:43.089 }, 00:16:43.089 { 00:16:43.089 "name": null, 00:16:43.089 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:43.089 "is_configured": false, 00:16:43.089 "data_offset": 2048, 00:16:43.089 "data_size": 63488 00:16:43.089 }, 00:16:43.089 { 00:16:43.089 "name": "BaseBdev3", 00:16:43.089 "uuid": 
"c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:43.089 "is_configured": true, 00:16:43.089 "data_offset": 2048, 00:16:43.089 "data_size": 63488 00:16:43.089 } 00:16:43.089 ] 00:16:43.089 }' 00:16:43.089 13:25:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:43.089 13:25:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:43.658 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.659 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:43.918 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:43.919 [2024-07-25 13:25:24.672010] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 
-- # local raid_bdev_info 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.919 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.178 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.178 "name": "Existed_Raid", 00:16:44.178 "uuid": "837472aa-00c5-43ba-bc81-10ca7b30598d", 00:16:44.178 "strip_size_kb": 0, 00:16:44.178 "state": "configuring", 00:16:44.178 "raid_level": "raid1", 00:16:44.178 "superblock": true, 00:16:44.178 "num_base_bdevs": 3, 00:16:44.178 "num_base_bdevs_discovered": 1, 00:16:44.178 "num_base_bdevs_operational": 3, 00:16:44.178 "base_bdevs_list": [ 00:16:44.178 { 00:16:44.178 "name": "BaseBdev1", 00:16:44.178 "uuid": "a3b3735a-c24d-4ba3-8380-4154a1db2ae7", 00:16:44.178 "is_configured": true, 00:16:44.178 "data_offset": 2048, 00:16:44.178 "data_size": 63488 00:16:44.178 }, 00:16:44.178 { 00:16:44.178 "name": null, 00:16:44.178 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:44.178 "is_configured": false, 00:16:44.178 "data_offset": 2048, 00:16:44.178 "data_size": 63488 00:16:44.178 }, 00:16:44.178 { 00:16:44.178 "name": null, 00:16:44.178 "uuid": "c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:44.178 "is_configured": false, 00:16:44.178 "data_offset": 2048, 00:16:44.178 "data_size": 63488 00:16:44.178 } 00:16:44.178 ] 00:16:44.178 }' 00:16:44.178 13:25:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:16:44.178 13:25:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:44.747 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:44.747 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:45.006 [2024-07-25 13:25:25.734717] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.006 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.266 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.266 "name": "Existed_Raid", 00:16:45.266 "uuid": "837472aa-00c5-43ba-bc81-10ca7b30598d", 00:16:45.266 "strip_size_kb": 0, 00:16:45.266 "state": "configuring", 00:16:45.266 "raid_level": "raid1", 00:16:45.266 "superblock": true, 00:16:45.266 "num_base_bdevs": 3, 00:16:45.266 "num_base_bdevs_discovered": 2, 00:16:45.266 "num_base_bdevs_operational": 3, 00:16:45.266 "base_bdevs_list": [ 00:16:45.266 { 00:16:45.266 "name": "BaseBdev1", 00:16:45.266 "uuid": "a3b3735a-c24d-4ba3-8380-4154a1db2ae7", 00:16:45.266 "is_configured": true, 00:16:45.266 "data_offset": 2048, 00:16:45.266 "data_size": 63488 00:16:45.266 }, 00:16:45.266 { 00:16:45.266 "name": null, 00:16:45.266 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:45.266 "is_configured": false, 00:16:45.266 "data_offset": 2048, 00:16:45.266 "data_size": 63488 00:16:45.266 }, 00:16:45.266 { 00:16:45.266 "name": "BaseBdev3", 00:16:45.266 "uuid": "c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:45.266 "is_configured": true, 00:16:45.266 "data_offset": 2048, 00:16:45.266 "data_size": 63488 00:16:45.266 } 00:16:45.266 ] 00:16:45.266 }' 00:16:45.266 13:25:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.266 13:25:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:45.835 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.835 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:46.094 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:46.094 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:46.095 [2024-07-25 13:25:26.865598] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:46.095 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:46.095 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.095 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.095 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:46.095 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:46.095 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:46.095 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.095 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.095 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.095 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.354 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.354 13:25:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:46.354 13:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.354 "name": "Existed_Raid", 00:16:46.354 "uuid": "837472aa-00c5-43ba-bc81-10ca7b30598d", 00:16:46.354 "strip_size_kb": 0, 00:16:46.354 "state": "configuring", 00:16:46.354 "raid_level": "raid1", 00:16:46.354 "superblock": true, 00:16:46.354 "num_base_bdevs": 3, 00:16:46.354 "num_base_bdevs_discovered": 1, 00:16:46.354 "num_base_bdevs_operational": 3, 00:16:46.354 "base_bdevs_list": [ 00:16:46.354 { 00:16:46.354 "name": null, 00:16:46.354 "uuid": "a3b3735a-c24d-4ba3-8380-4154a1db2ae7", 00:16:46.354 "is_configured": false, 00:16:46.354 "data_offset": 2048, 00:16:46.354 "data_size": 63488 00:16:46.354 }, 00:16:46.354 { 00:16:46.354 "name": null, 00:16:46.354 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:46.354 "is_configured": false, 00:16:46.354 "data_offset": 2048, 00:16:46.354 "data_size": 63488 00:16:46.354 }, 00:16:46.354 { 00:16:46.354 "name": "BaseBdev3", 00:16:46.354 "uuid": "c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:46.354 "is_configured": true, 00:16:46.354 "data_offset": 2048, 00:16:46.354 "data_size": 63488 00:16:46.355 } 00:16:46.355 ] 00:16:46.355 }' 00:16:46.355 13:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.355 13:25:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:46.923 13:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.923 13:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq 
'.[0].base_bdevs_list[0].is_configured' 00:16:47.183 13:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:47.183 13:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:47.443 [2024-07-25 13:25:27.986310] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:47.443 13:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:16:47.443 13:25:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] 
| select(.name == "Existed_Raid")' 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:47.443 "name": "Existed_Raid", 00:16:47.443 "uuid": "837472aa-00c5-43ba-bc81-10ca7b30598d", 00:16:47.443 "strip_size_kb": 0, 00:16:47.443 "state": "configuring", 00:16:47.443 "raid_level": "raid1", 00:16:47.443 "superblock": true, 00:16:47.443 "num_base_bdevs": 3, 00:16:47.443 "num_base_bdevs_discovered": 2, 00:16:47.443 "num_base_bdevs_operational": 3, 00:16:47.443 "base_bdevs_list": [ 00:16:47.443 { 00:16:47.443 "name": null, 00:16:47.443 "uuid": "a3b3735a-c24d-4ba3-8380-4154a1db2ae7", 00:16:47.443 "is_configured": false, 00:16:47.443 "data_offset": 2048, 00:16:47.443 "data_size": 63488 00:16:47.443 }, 00:16:47.443 { 00:16:47.443 "name": "BaseBdev2", 00:16:47.443 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:47.443 "is_configured": true, 00:16:47.443 "data_offset": 2048, 00:16:47.443 "data_size": 63488 00:16:47.443 }, 00:16:47.443 { 00:16:47.443 "name": "BaseBdev3", 00:16:47.443 "uuid": "c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:47.443 "is_configured": true, 00:16:47.443 "data_offset": 2048, 00:16:47.443 "data_size": 63488 00:16:47.443 } 00:16:47.443 ] 00:16:47.443 }' 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:47.443 13:25:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:48.011 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.012 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:48.285 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:48.286 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.286 13:25:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:48.548 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u a3b3735a-c24d-4ba3-8380-4154a1db2ae7 00:16:48.548 [2024-07-25 13:25:29.326717] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:48.548 [2024-07-25 13:25:29.326827] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x24293e0 00:16:48.548 [2024-07-25 13:25:29.326835] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:48.548 [2024-07-25 13:25:29.326968] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x242c4a0 00:16:48.548 [2024-07-25 13:25:29.327062] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x24293e0 00:16:48.548 [2024-07-25 13:25:29.327068] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x24293e0 00:16:48.548 [2024-07-25 13:25:29.327135] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:48.548 NewBaseBdev 00:16:48.548 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:48.548 13:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:16:48.548 13:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:16:48.548 13:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:16:48.548 13:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:16:48.548 
13:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:16:48.548 13:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:48.807 13:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:49.067 [ 00:16:49.067 { 00:16:49.067 "name": "NewBaseBdev", 00:16:49.067 "aliases": [ 00:16:49.067 "a3b3735a-c24d-4ba3-8380-4154a1db2ae7" 00:16:49.067 ], 00:16:49.067 "product_name": "Malloc disk", 00:16:49.067 "block_size": 512, 00:16:49.067 "num_blocks": 65536, 00:16:49.067 "uuid": "a3b3735a-c24d-4ba3-8380-4154a1db2ae7", 00:16:49.067 "assigned_rate_limits": { 00:16:49.067 "rw_ios_per_sec": 0, 00:16:49.067 "rw_mbytes_per_sec": 0, 00:16:49.067 "r_mbytes_per_sec": 0, 00:16:49.067 "w_mbytes_per_sec": 0 00:16:49.067 }, 00:16:49.067 "claimed": true, 00:16:49.067 "claim_type": "exclusive_write", 00:16:49.067 "zoned": false, 00:16:49.067 "supported_io_types": { 00:16:49.067 "read": true, 00:16:49.067 "write": true, 00:16:49.067 "unmap": true, 00:16:49.067 "flush": true, 00:16:49.067 "reset": true, 00:16:49.067 "nvme_admin": false, 00:16:49.067 "nvme_io": false, 00:16:49.067 "nvme_io_md": false, 00:16:49.067 "write_zeroes": true, 00:16:49.067 "zcopy": true, 00:16:49.067 "get_zone_info": false, 00:16:49.067 "zone_management": false, 00:16:49.067 "zone_append": false, 00:16:49.067 "compare": false, 00:16:49.067 "compare_and_write": false, 00:16:49.067 "abort": true, 00:16:49.067 "seek_hole": false, 00:16:49.067 "seek_data": false, 00:16:49.067 "copy": true, 00:16:49.067 "nvme_iov_md": false 00:16:49.067 }, 00:16:49.067 "memory_domains": [ 00:16:49.067 { 00:16:49.067 "dma_device_id": "system", 00:16:49.067 "dma_device_type": 1 00:16:49.067 
}, 00:16:49.067 { 00:16:49.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.067 "dma_device_type": 2 00:16:49.067 } 00:16:49.067 ], 00:16:49.067 "driver_specific": {} 00:16:49.067 } 00:16:49.067 ] 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.067 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:49.326 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:49.326 "name": "Existed_Raid", 00:16:49.326 "uuid": 
"837472aa-00c5-43ba-bc81-10ca7b30598d", 00:16:49.326 "strip_size_kb": 0, 00:16:49.326 "state": "online", 00:16:49.326 "raid_level": "raid1", 00:16:49.326 "superblock": true, 00:16:49.326 "num_base_bdevs": 3, 00:16:49.326 "num_base_bdevs_discovered": 3, 00:16:49.326 "num_base_bdevs_operational": 3, 00:16:49.326 "base_bdevs_list": [ 00:16:49.326 { 00:16:49.326 "name": "NewBaseBdev", 00:16:49.326 "uuid": "a3b3735a-c24d-4ba3-8380-4154a1db2ae7", 00:16:49.326 "is_configured": true, 00:16:49.326 "data_offset": 2048, 00:16:49.326 "data_size": 63488 00:16:49.326 }, 00:16:49.326 { 00:16:49.326 "name": "BaseBdev2", 00:16:49.326 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:49.326 "is_configured": true, 00:16:49.326 "data_offset": 2048, 00:16:49.326 "data_size": 63488 00:16:49.326 }, 00:16:49.326 { 00:16:49.326 "name": "BaseBdev3", 00:16:49.326 "uuid": "c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:49.326 "is_configured": true, 00:16:49.326 "data_offset": 2048, 00:16:49.326 "data_size": 63488 00:16:49.326 } 00:16:49.326 ] 00:16:49.326 }' 00:16:49.326 13:25:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:49.326 13:25:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:49.895 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:49.895 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:49.895 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:49.895 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:49.895 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:49.895 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:16:49.895 13:25:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:49.895 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:49.895 [2024-07-25 13:25:30.638279] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:49.895 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:49.895 "name": "Existed_Raid", 00:16:49.895 "aliases": [ 00:16:49.895 "837472aa-00c5-43ba-bc81-10ca7b30598d" 00:16:49.895 ], 00:16:49.895 "product_name": "Raid Volume", 00:16:49.895 "block_size": 512, 00:16:49.895 "num_blocks": 63488, 00:16:49.895 "uuid": "837472aa-00c5-43ba-bc81-10ca7b30598d", 00:16:49.895 "assigned_rate_limits": { 00:16:49.895 "rw_ios_per_sec": 0, 00:16:49.895 "rw_mbytes_per_sec": 0, 00:16:49.895 "r_mbytes_per_sec": 0, 00:16:49.895 "w_mbytes_per_sec": 0 00:16:49.895 }, 00:16:49.895 "claimed": false, 00:16:49.895 "zoned": false, 00:16:49.895 "supported_io_types": { 00:16:49.895 "read": true, 00:16:49.895 "write": true, 00:16:49.895 "unmap": false, 00:16:49.895 "flush": false, 00:16:49.895 "reset": true, 00:16:49.895 "nvme_admin": false, 00:16:49.895 "nvme_io": false, 00:16:49.895 "nvme_io_md": false, 00:16:49.895 "write_zeroes": true, 00:16:49.895 "zcopy": false, 00:16:49.895 "get_zone_info": false, 00:16:49.895 "zone_management": false, 00:16:49.895 "zone_append": false, 00:16:49.895 "compare": false, 00:16:49.895 "compare_and_write": false, 00:16:49.895 "abort": false, 00:16:49.895 "seek_hole": false, 00:16:49.895 "seek_data": false, 00:16:49.895 "copy": false, 00:16:49.895 "nvme_iov_md": false 00:16:49.895 }, 00:16:49.895 "memory_domains": [ 00:16:49.895 { 00:16:49.895 "dma_device_id": "system", 00:16:49.895 "dma_device_type": 1 00:16:49.895 }, 00:16:49.895 { 00:16:49.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.895 
"dma_device_type": 2 00:16:49.895 }, 00:16:49.895 { 00:16:49.895 "dma_device_id": "system", 00:16:49.895 "dma_device_type": 1 00:16:49.895 }, 00:16:49.895 { 00:16:49.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.895 "dma_device_type": 2 00:16:49.895 }, 00:16:49.895 { 00:16:49.895 "dma_device_id": "system", 00:16:49.895 "dma_device_type": 1 00:16:49.895 }, 00:16:49.895 { 00:16:49.895 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:49.895 "dma_device_type": 2 00:16:49.895 } 00:16:49.895 ], 00:16:49.895 "driver_specific": { 00:16:49.895 "raid": { 00:16:49.895 "uuid": "837472aa-00c5-43ba-bc81-10ca7b30598d", 00:16:49.895 "strip_size_kb": 0, 00:16:49.895 "state": "online", 00:16:49.895 "raid_level": "raid1", 00:16:49.895 "superblock": true, 00:16:49.895 "num_base_bdevs": 3, 00:16:49.896 "num_base_bdevs_discovered": 3, 00:16:49.896 "num_base_bdevs_operational": 3, 00:16:49.896 "base_bdevs_list": [ 00:16:49.896 { 00:16:49.896 "name": "NewBaseBdev", 00:16:49.896 "uuid": "a3b3735a-c24d-4ba3-8380-4154a1db2ae7", 00:16:49.896 "is_configured": true, 00:16:49.896 "data_offset": 2048, 00:16:49.896 "data_size": 63488 00:16:49.896 }, 00:16:49.896 { 00:16:49.896 "name": "BaseBdev2", 00:16:49.896 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:49.896 "is_configured": true, 00:16:49.896 "data_offset": 2048, 00:16:49.896 "data_size": 63488 00:16:49.896 }, 00:16:49.896 { 00:16:49.896 "name": "BaseBdev3", 00:16:49.896 "uuid": "c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:49.896 "is_configured": true, 00:16:49.896 "data_offset": 2048, 00:16:49.896 "data_size": 63488 00:16:49.896 } 00:16:49.896 ] 00:16:49.896 } 00:16:49.896 } 00:16:49.896 }' 00:16:49.896 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:50.155 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:50.155 BaseBdev2 00:16:50.155 
BaseBdev3' 00:16:50.155 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:50.155 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:50.155 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:50.155 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:50.155 "name": "NewBaseBdev", 00:16:50.155 "aliases": [ 00:16:50.155 "a3b3735a-c24d-4ba3-8380-4154a1db2ae7" 00:16:50.155 ], 00:16:50.155 "product_name": "Malloc disk", 00:16:50.155 "block_size": 512, 00:16:50.155 "num_blocks": 65536, 00:16:50.155 "uuid": "a3b3735a-c24d-4ba3-8380-4154a1db2ae7", 00:16:50.155 "assigned_rate_limits": { 00:16:50.155 "rw_ios_per_sec": 0, 00:16:50.155 "rw_mbytes_per_sec": 0, 00:16:50.155 "r_mbytes_per_sec": 0, 00:16:50.155 "w_mbytes_per_sec": 0 00:16:50.155 }, 00:16:50.155 "claimed": true, 00:16:50.155 "claim_type": "exclusive_write", 00:16:50.155 "zoned": false, 00:16:50.155 "supported_io_types": { 00:16:50.155 "read": true, 00:16:50.155 "write": true, 00:16:50.155 "unmap": true, 00:16:50.155 "flush": true, 00:16:50.155 "reset": true, 00:16:50.155 "nvme_admin": false, 00:16:50.155 "nvme_io": false, 00:16:50.155 "nvme_io_md": false, 00:16:50.155 "write_zeroes": true, 00:16:50.155 "zcopy": true, 00:16:50.155 "get_zone_info": false, 00:16:50.155 "zone_management": false, 00:16:50.155 "zone_append": false, 00:16:50.155 "compare": false, 00:16:50.155 "compare_and_write": false, 00:16:50.155 "abort": true, 00:16:50.155 "seek_hole": false, 00:16:50.155 "seek_data": false, 00:16:50.155 "copy": true, 00:16:50.155 "nvme_iov_md": false 00:16:50.155 }, 00:16:50.155 "memory_domains": [ 00:16:50.155 { 00:16:50.155 "dma_device_id": "system", 00:16:50.155 "dma_device_type": 1 00:16:50.155 }, 00:16:50.155 { 
00:16:50.155 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.155 "dma_device_type": 2 00:16:50.155 } 00:16:50.155 ], 00:16:50.155 "driver_specific": {} 00:16:50.155 }' 00:16:50.155 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.155 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.414 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:50.414 13:25:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.414 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.414 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:50.414 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.414 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.414 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:50.415 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:50.674 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:50.674 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:50.674 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:50.674 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:50.674 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:50.674 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:50.674 "name": 
"BaseBdev2", 00:16:50.674 "aliases": [ 00:16:50.674 "dd054684-ab4c-4233-89f8-cb89e56fa65c" 00:16:50.674 ], 00:16:50.674 "product_name": "Malloc disk", 00:16:50.674 "block_size": 512, 00:16:50.674 "num_blocks": 65536, 00:16:50.674 "uuid": "dd054684-ab4c-4233-89f8-cb89e56fa65c", 00:16:50.674 "assigned_rate_limits": { 00:16:50.674 "rw_ios_per_sec": 0, 00:16:50.674 "rw_mbytes_per_sec": 0, 00:16:50.674 "r_mbytes_per_sec": 0, 00:16:50.674 "w_mbytes_per_sec": 0 00:16:50.674 }, 00:16:50.674 "claimed": true, 00:16:50.674 "claim_type": "exclusive_write", 00:16:50.674 "zoned": false, 00:16:50.674 "supported_io_types": { 00:16:50.674 "read": true, 00:16:50.674 "write": true, 00:16:50.674 "unmap": true, 00:16:50.674 "flush": true, 00:16:50.674 "reset": true, 00:16:50.674 "nvme_admin": false, 00:16:50.674 "nvme_io": false, 00:16:50.674 "nvme_io_md": false, 00:16:50.674 "write_zeroes": true, 00:16:50.674 "zcopy": true, 00:16:50.674 "get_zone_info": false, 00:16:50.674 "zone_management": false, 00:16:50.674 "zone_append": false, 00:16:50.674 "compare": false, 00:16:50.674 "compare_and_write": false, 00:16:50.674 "abort": true, 00:16:50.674 "seek_hole": false, 00:16:50.674 "seek_data": false, 00:16:50.674 "copy": true, 00:16:50.674 "nvme_iov_md": false 00:16:50.674 }, 00:16:50.674 "memory_domains": [ 00:16:50.674 { 00:16:50.674 "dma_device_id": "system", 00:16:50.674 "dma_device_type": 1 00:16:50.674 }, 00:16:50.674 { 00:16:50.674 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.674 "dma_device_type": 2 00:16:50.674 } 00:16:50.674 ], 00:16:50.674 "driver_specific": {} 00:16:50.674 }' 00:16:50.934 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.934 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:50.934 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:50.934 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:16:50.934 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:50.934 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:50.934 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:50.934 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.193 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.193 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.193 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.193 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.193 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.193 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:51.193 13:25:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.453 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.453 "name": "BaseBdev3", 00:16:51.453 "aliases": [ 00:16:51.453 "c95510f1-8042-44d7-86b7-90b4ab9feaeb" 00:16:51.453 ], 00:16:51.453 "product_name": "Malloc disk", 00:16:51.453 "block_size": 512, 00:16:51.453 "num_blocks": 65536, 00:16:51.453 "uuid": "c95510f1-8042-44d7-86b7-90b4ab9feaeb", 00:16:51.453 "assigned_rate_limits": { 00:16:51.453 "rw_ios_per_sec": 0, 00:16:51.453 "rw_mbytes_per_sec": 0, 00:16:51.453 "r_mbytes_per_sec": 0, 00:16:51.453 "w_mbytes_per_sec": 0 00:16:51.453 }, 00:16:51.453 "claimed": true, 00:16:51.453 "claim_type": "exclusive_write", 00:16:51.453 "zoned": 
false, 00:16:51.453 "supported_io_types": { 00:16:51.453 "read": true, 00:16:51.453 "write": true, 00:16:51.453 "unmap": true, 00:16:51.453 "flush": true, 00:16:51.453 "reset": true, 00:16:51.453 "nvme_admin": false, 00:16:51.453 "nvme_io": false, 00:16:51.453 "nvme_io_md": false, 00:16:51.453 "write_zeroes": true, 00:16:51.453 "zcopy": true, 00:16:51.453 "get_zone_info": false, 00:16:51.454 "zone_management": false, 00:16:51.454 "zone_append": false, 00:16:51.454 "compare": false, 00:16:51.454 "compare_and_write": false, 00:16:51.454 "abort": true, 00:16:51.454 "seek_hole": false, 00:16:51.454 "seek_data": false, 00:16:51.454 "copy": true, 00:16:51.454 "nvme_iov_md": false 00:16:51.454 }, 00:16:51.454 "memory_domains": [ 00:16:51.454 { 00:16:51.454 "dma_device_id": "system", 00:16:51.454 "dma_device_type": 1 00:16:51.454 }, 00:16:51.454 { 00:16:51.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.454 "dma_device_type": 2 00:16:51.454 } 00:16:51.454 ], 00:16:51.454 "driver_specific": {} 00:16:51.454 }' 00:16:51.454 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.454 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.454 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:51.454 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.454 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.454 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:51.454 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.454 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.713 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.713 13:25:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.713 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.713 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.713 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:51.973 [2024-07-25 13:25:32.530879] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:51.973 [2024-07-25 13:25:32.530898] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:51.973 [2024-07-25 13:25:32.530934] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:51.973 [2024-07-25 13:25:32.531138] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:51.973 [2024-07-25 13:25:32.531145] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24293e0 name Existed_Raid, state offline 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 930932 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 930932 ']' 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 930932 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 930932 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 
00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 930932' 00:16:51.973 killing process with pid 930932 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 930932 00:16:51.973 [2024-07-25 13:25:32.599000] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 930932 00:16:51.973 [2024-07-25 13:25:32.613853] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:16:51.973 00:16:51.973 real 0m24.109s 00:16:51.973 user 0m45.183s 00:16:51.973 sys 0m3.592s 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:16:51.973 13:25:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:51.973 ************************************ 00:16:51.973 END TEST raid_state_function_test_sb 00:16:51.973 ************************************ 00:16:52.233 13:25:32 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:16:52.233 13:25:32 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:16:52.233 13:25:32 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:16:52.233 13:25:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:52.233 ************************************ 00:16:52.233 START TEST raid_superblock_test 00:16:52.233 ************************************ 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 3 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=3 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=935505 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 935505 /var/tmp/spdk-raid.sock 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 935505 ']' 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:52.233 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:16:52.233 13:25:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:52.233 [2024-07-25 13:25:32.866807] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:16:52.233 [2024-07-25 13:25:32.866856] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid935505 ] 00:16:52.233 [2024-07-25 13:25:32.956109] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:52.233 [2024-07-25 13:25:33.021048] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:16:52.492 [2024-07-25 13:25:33.062616] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:52.492 [2024-07-25 13:25:33.062639] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # 
(( i = 1 )) 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:53.062 13:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:16:53.322 malloc1 00:16:53.322 13:25:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:53.322 [2024-07-25 13:25:34.061409] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:53.322 [2024-07-25 13:25:34.061443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:53.322 [2024-07-25 13:25:34.061453] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x124c9b0 00:16:53.322 [2024-07-25 13:25:34.061460] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:53.322 [2024-07-25 13:25:34.062735] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:53.322 [2024-07-25 13:25:34.062755] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: pt1 00:16:53.322 pt1 00:16:53.322 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:53.322 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:53.322 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:16:53.322 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:16:53.322 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:16:53.322 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:53.322 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:53.322 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:53.322 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:16:53.582 malloc2 00:16:53.582 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:53.843 [2024-07-25 13:25:34.448451] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:53.843 [2024-07-25 13:25:34.448482] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:53.843 [2024-07-25 13:25:34.448491] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x124ddb0 00:16:53.843 [2024-07-25 13:25:34.448502] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:53.843 [2024-07-25 13:25:34.449729] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:16:53.843 [2024-07-25 13:25:34.449749] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:53.843 pt2 00:16:53.843 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:53.843 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:53.843 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:16:53.843 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:16:53.843 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:16:53.843 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:16:53.843 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:16:53.843 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:16:53.843 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:16:54.102 malloc3 00:16:54.102 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:54.102 [2024-07-25 13:25:34.823289] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:54.102 [2024-07-25 13:25:34.823318] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:54.102 [2024-07-25 13:25:34.823327] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13e4780 00:16:54.102 [2024-07-25 13:25:34.823333] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev 
claimed 00:16:54.102 [2024-07-25 13:25:34.824510] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:54.102 [2024-07-25 13:25:34.824529] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:54.102 pt3 00:16:54.102 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:16:54.102 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:16:54.102 13:25:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:16:54.363 [2024-07-25 13:25:35.015787] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:54.363 [2024-07-25 13:25:35.016790] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:54.363 [2024-07-25 13:25:35.016831] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:54.363 [2024-07-25 13:25:35.016940] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x12452e0 00:16:54.363 [2024-07-25 13:25:35.016947] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:54.363 [2024-07-25 13:25:35.017102] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x124c680 00:16:54.363 [2024-07-25 13:25:35.017213] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12452e0 00:16:54.363 [2024-07-25 13:25:35.017218] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12452e0 00:16:54.363 [2024-07-25 13:25:35.017299] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:54.363 13:25:35 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:54.363 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:54.623 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:54.623 "name": "raid_bdev1", 00:16:54.623 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:16:54.623 "strip_size_kb": 0, 00:16:54.623 "state": "online", 00:16:54.623 "raid_level": "raid1", 00:16:54.623 "superblock": true, 00:16:54.623 "num_base_bdevs": 3, 00:16:54.623 "num_base_bdevs_discovered": 3, 00:16:54.623 "num_base_bdevs_operational": 3, 00:16:54.623 "base_bdevs_list": [ 00:16:54.623 { 00:16:54.623 "name": "pt1", 00:16:54.623 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:54.623 "is_configured": true, 00:16:54.623 "data_offset": 2048, 00:16:54.623 "data_size": 63488 00:16:54.623 }, 00:16:54.623 { 
00:16:54.623 "name": "pt2", 00:16:54.623 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:54.623 "is_configured": true, 00:16:54.623 "data_offset": 2048, 00:16:54.623 "data_size": 63488 00:16:54.623 }, 00:16:54.623 { 00:16:54.623 "name": "pt3", 00:16:54.623 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:54.623 "is_configured": true, 00:16:54.623 "data_offset": 2048, 00:16:54.623 "data_size": 63488 00:16:54.623 } 00:16:54.623 ] 00:16:54.623 }' 00:16:54.623 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:54.623 13:25:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:55.193 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:16:55.193 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:16:55.193 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:55.193 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:55.193 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:55.193 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:55.193 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:55.193 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:55.193 [2024-07-25 13:25:35.954358] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:55.193 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:55.193 "name": "raid_bdev1", 00:16:55.193 "aliases": [ 00:16:55.193 "267000e3-6249-4a45-92fd-4ef03f31123c" 00:16:55.193 ], 00:16:55.193 "product_name": "Raid Volume", 00:16:55.193 
"block_size": 512, 00:16:55.193 "num_blocks": 63488, 00:16:55.193 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:16:55.193 "assigned_rate_limits": { 00:16:55.193 "rw_ios_per_sec": 0, 00:16:55.193 "rw_mbytes_per_sec": 0, 00:16:55.193 "r_mbytes_per_sec": 0, 00:16:55.193 "w_mbytes_per_sec": 0 00:16:55.193 }, 00:16:55.193 "claimed": false, 00:16:55.193 "zoned": false, 00:16:55.193 "supported_io_types": { 00:16:55.193 "read": true, 00:16:55.193 "write": true, 00:16:55.193 "unmap": false, 00:16:55.193 "flush": false, 00:16:55.193 "reset": true, 00:16:55.194 "nvme_admin": false, 00:16:55.194 "nvme_io": false, 00:16:55.194 "nvme_io_md": false, 00:16:55.194 "write_zeroes": true, 00:16:55.194 "zcopy": false, 00:16:55.194 "get_zone_info": false, 00:16:55.194 "zone_management": false, 00:16:55.194 "zone_append": false, 00:16:55.194 "compare": false, 00:16:55.194 "compare_and_write": false, 00:16:55.194 "abort": false, 00:16:55.194 "seek_hole": false, 00:16:55.194 "seek_data": false, 00:16:55.194 "copy": false, 00:16:55.194 "nvme_iov_md": false 00:16:55.194 }, 00:16:55.194 "memory_domains": [ 00:16:55.194 { 00:16:55.194 "dma_device_id": "system", 00:16:55.194 "dma_device_type": 1 00:16:55.194 }, 00:16:55.194 { 00:16:55.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.194 "dma_device_type": 2 00:16:55.194 }, 00:16:55.194 { 00:16:55.194 "dma_device_id": "system", 00:16:55.194 "dma_device_type": 1 00:16:55.194 }, 00:16:55.194 { 00:16:55.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.194 "dma_device_type": 2 00:16:55.194 }, 00:16:55.194 { 00:16:55.194 "dma_device_id": "system", 00:16:55.194 "dma_device_type": 1 00:16:55.194 }, 00:16:55.194 { 00:16:55.194 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.194 "dma_device_type": 2 00:16:55.194 } 00:16:55.194 ], 00:16:55.194 "driver_specific": { 00:16:55.194 "raid": { 00:16:55.194 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:16:55.194 "strip_size_kb": 0, 00:16:55.194 "state": "online", 00:16:55.194 
"raid_level": "raid1", 00:16:55.194 "superblock": true, 00:16:55.194 "num_base_bdevs": 3, 00:16:55.194 "num_base_bdevs_discovered": 3, 00:16:55.194 "num_base_bdevs_operational": 3, 00:16:55.194 "base_bdevs_list": [ 00:16:55.194 { 00:16:55.194 "name": "pt1", 00:16:55.194 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:55.194 "is_configured": true, 00:16:55.194 "data_offset": 2048, 00:16:55.194 "data_size": 63488 00:16:55.194 }, 00:16:55.194 { 00:16:55.194 "name": "pt2", 00:16:55.194 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:55.194 "is_configured": true, 00:16:55.194 "data_offset": 2048, 00:16:55.194 "data_size": 63488 00:16:55.194 }, 00:16:55.194 { 00:16:55.194 "name": "pt3", 00:16:55.194 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:55.194 "is_configured": true, 00:16:55.194 "data_offset": 2048, 00:16:55.194 "data_size": 63488 00:16:55.194 } 00:16:55.194 ] 00:16:55.194 } 00:16:55.194 } 00:16:55.194 }' 00:16:55.194 13:25:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:55.453 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:16:55.454 pt2 00:16:55.454 pt3' 00:16:55.454 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:55.454 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:16:55.454 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:55.454 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:55.454 "name": "pt1", 00:16:55.454 "aliases": [ 00:16:55.454 "00000000-0000-0000-0000-000000000001" 00:16:55.454 ], 00:16:55.454 "product_name": "passthru", 00:16:55.454 "block_size": 512, 00:16:55.454 "num_blocks": 65536, 00:16:55.454 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:16:55.454 "assigned_rate_limits": { 00:16:55.454 "rw_ios_per_sec": 0, 00:16:55.454 "rw_mbytes_per_sec": 0, 00:16:55.454 "r_mbytes_per_sec": 0, 00:16:55.454 "w_mbytes_per_sec": 0 00:16:55.454 }, 00:16:55.454 "claimed": true, 00:16:55.454 "claim_type": "exclusive_write", 00:16:55.454 "zoned": false, 00:16:55.454 "supported_io_types": { 00:16:55.454 "read": true, 00:16:55.454 "write": true, 00:16:55.454 "unmap": true, 00:16:55.454 "flush": true, 00:16:55.454 "reset": true, 00:16:55.454 "nvme_admin": false, 00:16:55.454 "nvme_io": false, 00:16:55.454 "nvme_io_md": false, 00:16:55.454 "write_zeroes": true, 00:16:55.454 "zcopy": true, 00:16:55.454 "get_zone_info": false, 00:16:55.454 "zone_management": false, 00:16:55.454 "zone_append": false, 00:16:55.454 "compare": false, 00:16:55.454 "compare_and_write": false, 00:16:55.454 "abort": true, 00:16:55.454 "seek_hole": false, 00:16:55.454 "seek_data": false, 00:16:55.454 "copy": true, 00:16:55.454 "nvme_iov_md": false 00:16:55.454 }, 00:16:55.454 "memory_domains": [ 00:16:55.454 { 00:16:55.454 "dma_device_id": "system", 00:16:55.454 "dma_device_type": 1 00:16:55.454 }, 00:16:55.454 { 00:16:55.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:55.454 "dma_device_type": 2 00:16:55.454 } 00:16:55.454 ], 00:16:55.454 "driver_specific": { 00:16:55.454 "passthru": { 00:16:55.454 "name": "pt1", 00:16:55.454 "base_bdev_name": "malloc1" 00:16:55.454 } 00:16:55.454 } 00:16:55.454 }' 00:16:55.454 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.714 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:55.714 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:55.714 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.714 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:55.714 13:25:36 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:55.714 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.714 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:55.714 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:55.714 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.973 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:55.973 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:55.973 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:55.973 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:16:55.973 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:56.234 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:56.234 "name": "pt2", 00:16:56.234 "aliases": [ 00:16:56.235 "00000000-0000-0000-0000-000000000002" 00:16:56.235 ], 00:16:56.235 "product_name": "passthru", 00:16:56.235 "block_size": 512, 00:16:56.235 "num_blocks": 65536, 00:16:56.235 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:56.235 "assigned_rate_limits": { 00:16:56.235 "rw_ios_per_sec": 0, 00:16:56.235 "rw_mbytes_per_sec": 0, 00:16:56.235 "r_mbytes_per_sec": 0, 00:16:56.235 "w_mbytes_per_sec": 0 00:16:56.235 }, 00:16:56.235 "claimed": true, 00:16:56.235 "claim_type": "exclusive_write", 00:16:56.235 "zoned": false, 00:16:56.235 "supported_io_types": { 00:16:56.235 "read": true, 00:16:56.235 "write": true, 00:16:56.235 "unmap": true, 00:16:56.235 "flush": true, 00:16:56.235 "reset": true, 00:16:56.235 "nvme_admin": false, 00:16:56.235 
"nvme_io": false, 00:16:56.235 "nvme_io_md": false, 00:16:56.235 "write_zeroes": true, 00:16:56.235 "zcopy": true, 00:16:56.235 "get_zone_info": false, 00:16:56.235 "zone_management": false, 00:16:56.235 "zone_append": false, 00:16:56.235 "compare": false, 00:16:56.235 "compare_and_write": false, 00:16:56.235 "abort": true, 00:16:56.235 "seek_hole": false, 00:16:56.235 "seek_data": false, 00:16:56.235 "copy": true, 00:16:56.235 "nvme_iov_md": false 00:16:56.235 }, 00:16:56.235 "memory_domains": [ 00:16:56.235 { 00:16:56.235 "dma_device_id": "system", 00:16:56.235 "dma_device_type": 1 00:16:56.235 }, 00:16:56.235 { 00:16:56.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.235 "dma_device_type": 2 00:16:56.235 } 00:16:56.235 ], 00:16:56.235 "driver_specific": { 00:16:56.235 "passthru": { 00:16:56.235 "name": "pt2", 00:16:56.235 "base_bdev_name": "malloc2" 00:16:56.235 } 00:16:56.235 } 00:16:56.235 }' 00:16:56.235 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:56.235 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:56.235 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:56.235 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:56.235 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:56.235 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:56.235 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:56.235 13:25:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:56.496 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:56.496 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:56.496 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:16:56.496 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:56.496 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:56.496 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:56.496 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:16:56.756 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:56.756 "name": "pt3", 00:16:56.756 "aliases": [ 00:16:56.756 "00000000-0000-0000-0000-000000000003" 00:16:56.756 ], 00:16:56.756 "product_name": "passthru", 00:16:56.756 "block_size": 512, 00:16:56.756 "num_blocks": 65536, 00:16:56.756 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:56.756 "assigned_rate_limits": { 00:16:56.756 "rw_ios_per_sec": 0, 00:16:56.756 "rw_mbytes_per_sec": 0, 00:16:56.756 "r_mbytes_per_sec": 0, 00:16:56.756 "w_mbytes_per_sec": 0 00:16:56.756 }, 00:16:56.756 "claimed": true, 00:16:56.756 "claim_type": "exclusive_write", 00:16:56.756 "zoned": false, 00:16:56.756 "supported_io_types": { 00:16:56.756 "read": true, 00:16:56.756 "write": true, 00:16:56.756 "unmap": true, 00:16:56.756 "flush": true, 00:16:56.756 "reset": true, 00:16:56.756 "nvme_admin": false, 00:16:56.756 "nvme_io": false, 00:16:56.756 "nvme_io_md": false, 00:16:56.756 "write_zeroes": true, 00:16:56.756 "zcopy": true, 00:16:56.756 "get_zone_info": false, 00:16:56.756 "zone_management": false, 00:16:56.756 "zone_append": false, 00:16:56.756 "compare": false, 00:16:56.756 "compare_and_write": false, 00:16:56.756 "abort": true, 00:16:56.756 "seek_hole": false, 00:16:56.756 "seek_data": false, 00:16:56.756 "copy": true, 00:16:56.756 "nvme_iov_md": false 00:16:56.756 }, 00:16:56.756 "memory_domains": [ 00:16:56.756 { 00:16:56.756 "dma_device_id": "system", 00:16:56.756 
"dma_device_type": 1 00:16:56.756 }, 00:16:56.756 { 00:16:56.756 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:56.756 "dma_device_type": 2 00:16:56.756 } 00:16:56.756 ], 00:16:56.756 "driver_specific": { 00:16:56.756 "passthru": { 00:16:56.757 "name": "pt3", 00:16:56.757 "base_bdev_name": "malloc3" 00:16:56.757 } 00:16:56.757 } 00:16:56.757 }' 00:16:56.757 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:56.757 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:56.757 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:56.757 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:56.757 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:56.757 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:56.757 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:57.018 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:57.018 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:57.018 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:57.018 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:57.018 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:57.018 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:16:57.018 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:57.279 [2024-07-25 13:25:37.867222] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:57.279 13:25:37 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=267000e3-6249-4a45-92fd-4ef03f31123c 00:16:57.279 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 267000e3-6249-4a45-92fd-4ef03f31123c ']' 00:16:57.279 13:25:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:57.279 [2024-07-25 13:25:38.059490] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:57.279 [2024-07-25 13:25:38.059502] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:57.279 [2024-07-25 13:25:38.059539] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:57.279 [2024-07-25 13:25:38.059592] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:57.279 [2024-07-25 13:25:38.059598] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12452e0 name raid_bdev1, state offline 00:16:57.555 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.555 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:16:57.555 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:16:57.555 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:16:57.555 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:16:57.555 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:16:57.851 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i 
in "${base_bdevs_pt[@]}" 00:16:57.851 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:58.116 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:16:58.116 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:58.116 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:16:58.116 13:25:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:16:58.376 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:16:58.637 [2024-07-25 13:25:39.218385] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:16:58.637 [2024-07-25 13:25:39.219448] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:16:58.637 [2024-07-25 13:25:39.219480] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:16:58.637 [2024-07-25 13:25:39.219515] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:16:58.637 [2024-07-25 13:25:39.219542] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:16:58.637 [2024-07-25 13:25:39.219565] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:16:58.637 [2024-07-25 13:25:39.219575] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:16:58.637 [2024-07-25 13:25:39.219581] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x124ce50 name raid_bdev1, state configuring 00:16:58.637 request: 00:16:58.637 { 00:16:58.637 "name": "raid_bdev1", 00:16:58.637 "raid_level": "raid1", 00:16:58.637 "base_bdevs": [ 00:16:58.637 "malloc1", 00:16:58.637 "malloc2", 00:16:58.637 "malloc3" 00:16:58.637 ], 00:16:58.637 "superblock": false, 00:16:58.637 "method": "bdev_raid_create", 00:16:58.637 "req_id": 1 00:16:58.637 } 00:16:58.637 Got JSON-RPC error response 00:16:58.637 response: 00:16:58.637 { 00:16:58.637 "code": -17, 00:16:58.637 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:16:58.637 } 00:16:58.637 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:16:58.637 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:16:58.637 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:16:58.637 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:16:58.637 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.637 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:16:58.897 [2024-07-25 13:25:39.607319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:58.897 
[2024-07-25 13:25:39.607342] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:58.897 [2024-07-25 13:25:39.607352] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x124cbe0 00:16:58.897 [2024-07-25 13:25:39.607359] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:58.897 [2024-07-25 13:25:39.608615] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:58.897 [2024-07-25 13:25:39.608636] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:58.897 [2024-07-25 13:25:39.608678] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:58.897 [2024-07-25 13:25:39.608697] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:58.897 pt1 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:58.897 13:25:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:58.897 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:59.156 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:59.156 "name": "raid_bdev1", 00:16:59.156 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:16:59.156 "strip_size_kb": 0, 00:16:59.156 "state": "configuring", 00:16:59.156 "raid_level": "raid1", 00:16:59.156 "superblock": true, 00:16:59.156 "num_base_bdevs": 3, 00:16:59.156 "num_base_bdevs_discovered": 1, 00:16:59.156 "num_base_bdevs_operational": 3, 00:16:59.156 "base_bdevs_list": [ 00:16:59.156 { 00:16:59.156 "name": "pt1", 00:16:59.156 "uuid": "00000000-0000-0000-0000-000000000001", 00:16:59.156 "is_configured": true, 00:16:59.156 "data_offset": 2048, 00:16:59.156 "data_size": 63488 00:16:59.156 }, 00:16:59.156 { 00:16:59.156 "name": null, 00:16:59.156 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:59.156 "is_configured": false, 00:16:59.156 "data_offset": 2048, 00:16:59.156 "data_size": 63488 00:16:59.156 }, 00:16:59.156 { 00:16:59.156 "name": null, 00:16:59.156 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:59.156 "is_configured": false, 00:16:59.156 "data_offset": 2048, 00:16:59.156 "data_size": 63488 00:16:59.156 } 00:16:59.156 ] 00:16:59.156 }' 00:16:59.156 13:25:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:59.156 13:25:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:59.724 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 3 -gt 2 ']' 00:16:59.724 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 
-u 00000000-0000-0000-0000-000000000002 00:16:59.984 [2024-07-25 13:25:40.549724] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:59.984 [2024-07-25 13:25:40.549766] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:59.984 [2024-07-25 13:25:40.549777] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13f0410 00:16:59.984 [2024-07-25 13:25:40.549783] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:59.984 [2024-07-25 13:25:40.550053] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:59.984 [2024-07-25 13:25:40.550064] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:59.984 [2024-07-25 13:25:40.550107] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:59.984 [2024-07-25 13:25:40.550121] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:59.984 pt2 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:59.984 [2024-07-25 13:25:40.746224] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.984 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:00.244 13:25:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:00.244 "name": "raid_bdev1", 00:17:00.244 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:17:00.244 "strip_size_kb": 0, 00:17:00.244 "state": "configuring", 00:17:00.244 "raid_level": "raid1", 00:17:00.244 "superblock": true, 00:17:00.244 "num_base_bdevs": 3, 00:17:00.244 "num_base_bdevs_discovered": 1, 00:17:00.244 "num_base_bdevs_operational": 3, 00:17:00.244 "base_bdevs_list": [ 00:17:00.244 { 00:17:00.244 "name": "pt1", 00:17:00.244 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:00.244 "is_configured": true, 00:17:00.244 "data_offset": 2048, 00:17:00.244 "data_size": 63488 00:17:00.244 }, 00:17:00.244 { 00:17:00.244 "name": null, 00:17:00.244 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:00.244 "is_configured": false, 00:17:00.244 "data_offset": 2048, 00:17:00.244 "data_size": 63488 00:17:00.244 }, 00:17:00.244 { 00:17:00.244 "name": null, 00:17:00.244 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:00.244 "is_configured": false, 00:17:00.244 "data_offset": 2048, 00:17:00.244 "data_size": 63488 00:17:00.244 } 00:17:00.244 ] 00:17:00.244 }' 00:17:00.244 13:25:40 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:00.244 13:25:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:00.814 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:17:00.814 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:00.814 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:01.074 [2024-07-25 13:25:41.692642] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:01.074 [2024-07-25 13:25:41.692674] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:01.074 [2024-07-25 13:25:41.692683] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1243d30 00:17:01.074 [2024-07-25 13:25:41.692694] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:01.074 [2024-07-25 13:25:41.692961] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:01.074 [2024-07-25 13:25:41.692973] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:01.074 [2024-07-25 13:25:41.693015] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:01.074 [2024-07-25 13:25:41.693028] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:01.074 pt2 00:17:01.074 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:17:01.074 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:17:01.074 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:01.335 [2024-07-25 13:25:41.869090] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:01.335 [2024-07-25 13:25:41.869110] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:01.335 [2024-07-25 13:25:41.869120] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1243300 00:17:01.335 [2024-07-25 13:25:41.869126] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:01.335 [2024-07-25 13:25:41.869342] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:01.335 [2024-07-25 13:25:41.869352] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:01.335 [2024-07-25 13:25:41.869384] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:01.335 [2024-07-25 13:25:41.869394] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:01.336 [2024-07-25 13:25:41.869474] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x12437c0 00:17:01.336 [2024-07-25 13:25:41.869480] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:01.336 [2024-07-25 13:25:41.869616] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e7560 00:17:01.336 [2024-07-25 13:25:41.869718] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12437c0 00:17:01.336 [2024-07-25 13:25:41.869724] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12437c0 00:17:01.336 [2024-07-25 13:25:41.869794] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:01.336 pt3 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < 
num_base_bdevs )) 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.336 13:25:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:01.336 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.336 "name": "raid_bdev1", 00:17:01.336 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:17:01.336 "strip_size_kb": 0, 00:17:01.336 "state": "online", 00:17:01.336 "raid_level": "raid1", 00:17:01.336 "superblock": true, 00:17:01.336 "num_base_bdevs": 3, 00:17:01.336 "num_base_bdevs_discovered": 3, 00:17:01.336 "num_base_bdevs_operational": 3, 00:17:01.336 "base_bdevs_list": [ 00:17:01.336 { 00:17:01.336 "name": "pt1", 00:17:01.336 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:17:01.336 "is_configured": true, 00:17:01.336 "data_offset": 2048, 00:17:01.336 "data_size": 63488 00:17:01.336 }, 00:17:01.336 { 00:17:01.336 "name": "pt2", 00:17:01.336 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:01.336 "is_configured": true, 00:17:01.336 "data_offset": 2048, 00:17:01.336 "data_size": 63488 00:17:01.336 }, 00:17:01.336 { 00:17:01.336 "name": "pt3", 00:17:01.336 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:01.336 "is_configured": true, 00:17:01.336 "data_offset": 2048, 00:17:01.336 "data_size": 63488 00:17:01.336 } 00:17:01.336 ] 00:17:01.336 }' 00:17:01.336 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.336 13:25:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:01.906 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:17:01.906 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:01.906 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:01.906 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:01.906 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:01.906 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:01.906 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:01.906 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:02.165 [2024-07-25 13:25:42.815706] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:02.165 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:02.165 
"name": "raid_bdev1", 00:17:02.165 "aliases": [ 00:17:02.166 "267000e3-6249-4a45-92fd-4ef03f31123c" 00:17:02.166 ], 00:17:02.166 "product_name": "Raid Volume", 00:17:02.166 "block_size": 512, 00:17:02.166 "num_blocks": 63488, 00:17:02.166 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:17:02.166 "assigned_rate_limits": { 00:17:02.166 "rw_ios_per_sec": 0, 00:17:02.166 "rw_mbytes_per_sec": 0, 00:17:02.166 "r_mbytes_per_sec": 0, 00:17:02.166 "w_mbytes_per_sec": 0 00:17:02.166 }, 00:17:02.166 "claimed": false, 00:17:02.166 "zoned": false, 00:17:02.166 "supported_io_types": { 00:17:02.166 "read": true, 00:17:02.166 "write": true, 00:17:02.166 "unmap": false, 00:17:02.166 "flush": false, 00:17:02.166 "reset": true, 00:17:02.166 "nvme_admin": false, 00:17:02.166 "nvme_io": false, 00:17:02.166 "nvme_io_md": false, 00:17:02.166 "write_zeroes": true, 00:17:02.166 "zcopy": false, 00:17:02.166 "get_zone_info": false, 00:17:02.166 "zone_management": false, 00:17:02.166 "zone_append": false, 00:17:02.166 "compare": false, 00:17:02.166 "compare_and_write": false, 00:17:02.166 "abort": false, 00:17:02.166 "seek_hole": false, 00:17:02.166 "seek_data": false, 00:17:02.166 "copy": false, 00:17:02.166 "nvme_iov_md": false 00:17:02.166 }, 00:17:02.166 "memory_domains": [ 00:17:02.166 { 00:17:02.166 "dma_device_id": "system", 00:17:02.166 "dma_device_type": 1 00:17:02.166 }, 00:17:02.166 { 00:17:02.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.166 "dma_device_type": 2 00:17:02.166 }, 00:17:02.166 { 00:17:02.166 "dma_device_id": "system", 00:17:02.166 "dma_device_type": 1 00:17:02.166 }, 00:17:02.166 { 00:17:02.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.166 "dma_device_type": 2 00:17:02.166 }, 00:17:02.166 { 00:17:02.166 "dma_device_id": "system", 00:17:02.166 "dma_device_type": 1 00:17:02.166 }, 00:17:02.166 { 00:17:02.166 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.166 "dma_device_type": 2 00:17:02.166 } 00:17:02.166 ], 00:17:02.166 "driver_specific": { 
00:17:02.166 "raid": { 00:17:02.166 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:17:02.166 "strip_size_kb": 0, 00:17:02.166 "state": "online", 00:17:02.166 "raid_level": "raid1", 00:17:02.166 "superblock": true, 00:17:02.166 "num_base_bdevs": 3, 00:17:02.166 "num_base_bdevs_discovered": 3, 00:17:02.166 "num_base_bdevs_operational": 3, 00:17:02.166 "base_bdevs_list": [ 00:17:02.166 { 00:17:02.166 "name": "pt1", 00:17:02.166 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:02.166 "is_configured": true, 00:17:02.166 "data_offset": 2048, 00:17:02.166 "data_size": 63488 00:17:02.166 }, 00:17:02.166 { 00:17:02.166 "name": "pt2", 00:17:02.166 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:02.166 "is_configured": true, 00:17:02.166 "data_offset": 2048, 00:17:02.166 "data_size": 63488 00:17:02.166 }, 00:17:02.166 { 00:17:02.166 "name": "pt3", 00:17:02.166 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:02.166 "is_configured": true, 00:17:02.166 "data_offset": 2048, 00:17:02.166 "data_size": 63488 00:17:02.166 } 00:17:02.166 ] 00:17:02.166 } 00:17:02.166 } 00:17:02.166 }' 00:17:02.166 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:02.166 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:02.166 pt2 00:17:02.166 pt3' 00:17:02.166 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:02.166 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:02.166 13:25:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.426 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.426 "name": "pt1", 00:17:02.426 "aliases": [ 00:17:02.426 
"00000000-0000-0000-0000-000000000001" 00:17:02.426 ], 00:17:02.426 "product_name": "passthru", 00:17:02.426 "block_size": 512, 00:17:02.426 "num_blocks": 65536, 00:17:02.426 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:02.426 "assigned_rate_limits": { 00:17:02.426 "rw_ios_per_sec": 0, 00:17:02.426 "rw_mbytes_per_sec": 0, 00:17:02.426 "r_mbytes_per_sec": 0, 00:17:02.426 "w_mbytes_per_sec": 0 00:17:02.426 }, 00:17:02.426 "claimed": true, 00:17:02.426 "claim_type": "exclusive_write", 00:17:02.426 "zoned": false, 00:17:02.426 "supported_io_types": { 00:17:02.426 "read": true, 00:17:02.426 "write": true, 00:17:02.426 "unmap": true, 00:17:02.426 "flush": true, 00:17:02.426 "reset": true, 00:17:02.426 "nvme_admin": false, 00:17:02.426 "nvme_io": false, 00:17:02.426 "nvme_io_md": false, 00:17:02.426 "write_zeroes": true, 00:17:02.426 "zcopy": true, 00:17:02.426 "get_zone_info": false, 00:17:02.426 "zone_management": false, 00:17:02.426 "zone_append": false, 00:17:02.426 "compare": false, 00:17:02.426 "compare_and_write": false, 00:17:02.426 "abort": true, 00:17:02.426 "seek_hole": false, 00:17:02.426 "seek_data": false, 00:17:02.426 "copy": true, 00:17:02.426 "nvme_iov_md": false 00:17:02.426 }, 00:17:02.426 "memory_domains": [ 00:17:02.426 { 00:17:02.426 "dma_device_id": "system", 00:17:02.426 "dma_device_type": 1 00:17:02.426 }, 00:17:02.426 { 00:17:02.426 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.426 "dma_device_type": 2 00:17:02.426 } 00:17:02.426 ], 00:17:02.426 "driver_specific": { 00:17:02.426 "passthru": { 00:17:02.426 "name": "pt1", 00:17:02.426 "base_bdev_name": "malloc1" 00:17:02.426 } 00:17:02.426 } 00:17:02.426 }' 00:17:02.426 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.426 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.426 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.426 13:25:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:02.687 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:02.947 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:02.947 "name": "pt2", 00:17:02.947 "aliases": [ 00:17:02.947 "00000000-0000-0000-0000-000000000002" 00:17:02.947 ], 00:17:02.947 "product_name": "passthru", 00:17:02.947 "block_size": 512, 00:17:02.947 "num_blocks": 65536, 00:17:02.947 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:02.947 "assigned_rate_limits": { 00:17:02.947 "rw_ios_per_sec": 0, 00:17:02.947 "rw_mbytes_per_sec": 0, 00:17:02.947 "r_mbytes_per_sec": 0, 00:17:02.947 "w_mbytes_per_sec": 0 00:17:02.947 }, 00:17:02.947 "claimed": true, 00:17:02.947 "claim_type": "exclusive_write", 00:17:02.947 "zoned": false, 00:17:02.947 "supported_io_types": { 
00:17:02.947 "read": true, 00:17:02.947 "write": true, 00:17:02.947 "unmap": true, 00:17:02.947 "flush": true, 00:17:02.947 "reset": true, 00:17:02.947 "nvme_admin": false, 00:17:02.947 "nvme_io": false, 00:17:02.947 "nvme_io_md": false, 00:17:02.947 "write_zeroes": true, 00:17:02.947 "zcopy": true, 00:17:02.947 "get_zone_info": false, 00:17:02.947 "zone_management": false, 00:17:02.947 "zone_append": false, 00:17:02.947 "compare": false, 00:17:02.947 "compare_and_write": false, 00:17:02.947 "abort": true, 00:17:02.947 "seek_hole": false, 00:17:02.947 "seek_data": false, 00:17:02.947 "copy": true, 00:17:02.947 "nvme_iov_md": false 00:17:02.947 }, 00:17:02.947 "memory_domains": [ 00:17:02.947 { 00:17:02.947 "dma_device_id": "system", 00:17:02.947 "dma_device_type": 1 00:17:02.947 }, 00:17:02.947 { 00:17:02.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.947 "dma_device_type": 2 00:17:02.947 } 00:17:02.947 ], 00:17:02.947 "driver_specific": { 00:17:02.947 "passthru": { 00:17:02.947 "name": "pt2", 00:17:02.947 "base_bdev_name": "malloc2" 00:17:02.947 } 00:17:02.947 } 00:17:02.947 }' 00:17:02.947 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.947 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:02.947 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:02.947 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:03.206 13:25:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:03.466 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:03.466 "name": "pt3", 00:17:03.466 "aliases": [ 00:17:03.466 "00000000-0000-0000-0000-000000000003" 00:17:03.466 ], 00:17:03.466 "product_name": "passthru", 00:17:03.466 "block_size": 512, 00:17:03.466 "num_blocks": 65536, 00:17:03.466 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:03.466 "assigned_rate_limits": { 00:17:03.466 "rw_ios_per_sec": 0, 00:17:03.466 "rw_mbytes_per_sec": 0, 00:17:03.466 "r_mbytes_per_sec": 0, 00:17:03.466 "w_mbytes_per_sec": 0 00:17:03.466 }, 00:17:03.466 "claimed": true, 00:17:03.466 "claim_type": "exclusive_write", 00:17:03.466 "zoned": false, 00:17:03.466 "supported_io_types": { 00:17:03.466 "read": true, 00:17:03.466 "write": true, 00:17:03.466 "unmap": true, 00:17:03.466 "flush": true, 00:17:03.466 "reset": true, 00:17:03.466 "nvme_admin": false, 00:17:03.466 "nvme_io": false, 00:17:03.466 "nvme_io_md": false, 00:17:03.466 "write_zeroes": true, 00:17:03.466 "zcopy": true, 00:17:03.466 "get_zone_info": false, 00:17:03.466 "zone_management": false, 00:17:03.466 "zone_append": false, 00:17:03.466 "compare": false, 00:17:03.466 "compare_and_write": false, 00:17:03.466 "abort": true, 00:17:03.466 "seek_hole": false, 00:17:03.466 "seek_data": 
false, 00:17:03.466 "copy": true, 00:17:03.466 "nvme_iov_md": false 00:17:03.466 }, 00:17:03.466 "memory_domains": [ 00:17:03.466 { 00:17:03.466 "dma_device_id": "system", 00:17:03.466 "dma_device_type": 1 00:17:03.466 }, 00:17:03.466 { 00:17:03.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:03.466 "dma_device_type": 2 00:17:03.466 } 00:17:03.466 ], 00:17:03.466 "driver_specific": { 00:17:03.466 "passthru": { 00:17:03.466 "name": "pt3", 00:17:03.466 "base_bdev_name": "malloc3" 00:17:03.466 } 00:17:03.466 } 00:17:03.466 }' 00:17:03.466 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.466 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:03.725 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:03.725 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.725 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:03.725 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:03.725 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.725 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:03.725 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:03.725 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.725 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:03.986 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:03.986 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:03.986 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 
-- # jq -r '.[] | .uuid' 00:17:03.986 [2024-07-25 13:25:44.724534] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:03.986 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 267000e3-6249-4a45-92fd-4ef03f31123c '!=' 267000e3-6249-4a45-92fd-4ef03f31123c ']' 00:17:03.986 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:17:03.986 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:03.986 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:03.986 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:17:04.246 [2024-07-25 13:25:44.916818] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # 
local tmp 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.246 13:25:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:04.506 13:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.506 "name": "raid_bdev1", 00:17:04.506 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:17:04.506 "strip_size_kb": 0, 00:17:04.506 "state": "online", 00:17:04.506 "raid_level": "raid1", 00:17:04.506 "superblock": true, 00:17:04.507 "num_base_bdevs": 3, 00:17:04.507 "num_base_bdevs_discovered": 2, 00:17:04.507 "num_base_bdevs_operational": 2, 00:17:04.507 "base_bdevs_list": [ 00:17:04.507 { 00:17:04.507 "name": null, 00:17:04.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:04.507 "is_configured": false, 00:17:04.507 "data_offset": 2048, 00:17:04.507 "data_size": 63488 00:17:04.507 }, 00:17:04.507 { 00:17:04.507 "name": "pt2", 00:17:04.507 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:04.507 "is_configured": true, 00:17:04.507 "data_offset": 2048, 00:17:04.507 "data_size": 63488 00:17:04.507 }, 00:17:04.507 { 00:17:04.507 "name": "pt3", 00:17:04.507 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:04.507 "is_configured": true, 00:17:04.507 "data_offset": 2048, 00:17:04.507 "data_size": 63488 00:17:04.507 } 00:17:04.507 ] 00:17:04.507 }' 00:17:04.507 13:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.507 13:25:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:05.075 13:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:05.075 [2024-07-25 13:25:45.823095] bdev_raid.c:2398:raid_bdev_delete: 
*DEBUG*: delete raid bdev: raid_bdev1 00:17:05.075 [2024-07-25 13:25:45.823119] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:05.075 [2024-07-25 13:25:45.823157] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:05.075 [2024-07-25 13:25:45.823198] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:05.075 [2024-07-25 13:25:45.823204] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12437c0 name raid_bdev1, state offline 00:17:05.075 13:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.075 13:25:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:17:05.335 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:17:05.335 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:17:05.335 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:17:05.335 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:17:05.335 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:17:05.594 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:17:05.594 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:17:05.594 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:17:05.854 13:25:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:05.854 [2024-07-25 13:25:46.576985] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:05.854 [2024-07-25 13:25:46.577020] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:05.854 [2024-07-25 13:25:46.577031] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1245f90 00:17:05.854 [2024-07-25 13:25:46.577037] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:05.854 [2024-07-25 13:25:46.578309] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:05.854 [2024-07-25 13:25:46.578332] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:05.854 [2024-07-25 13:25:46.578380] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:05.854 [2024-07-25 13:25:46.578399] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:05.854 pt2 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:05.854 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:06.114 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.114 "name": "raid_bdev1", 00:17:06.114 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:17:06.114 "strip_size_kb": 0, 00:17:06.114 "state": "configuring", 00:17:06.114 "raid_level": "raid1", 00:17:06.114 "superblock": true, 00:17:06.114 "num_base_bdevs": 3, 00:17:06.114 "num_base_bdevs_discovered": 1, 00:17:06.114 "num_base_bdevs_operational": 2, 00:17:06.114 "base_bdevs_list": [ 00:17:06.114 { 00:17:06.114 "name": null, 00:17:06.114 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.114 "is_configured": false, 00:17:06.114 "data_offset": 2048, 00:17:06.114 "data_size": 63488 00:17:06.114 }, 00:17:06.114 { 00:17:06.114 "name": "pt2", 00:17:06.114 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:06.114 "is_configured": true, 00:17:06.114 "data_offset": 2048, 00:17:06.114 "data_size": 63488 00:17:06.114 }, 00:17:06.114 { 00:17:06.114 "name": null, 00:17:06.114 "uuid": 
"00000000-0000-0000-0000-000000000003", 00:17:06.114 "is_configured": false, 00:17:06.114 "data_offset": 2048, 00:17:06.114 "data_size": 63488 00:17:06.114 } 00:17:06.114 ] 00:17:06.114 }' 00:17:06.114 13:25:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.114 13:25:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:06.684 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:17:06.684 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:17:06.684 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=2 00:17:06.684 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:06.684 [2024-07-25 13:25:47.471258] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:06.684 [2024-07-25 13:25:47.471294] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:06.684 [2024-07-25 13:25:47.471306] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x124ce50 00:17:06.684 [2024-07-25 13:25:47.471313] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:06.684 [2024-07-25 13:25:47.471612] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:06.684 [2024-07-25 13:25:47.471625] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:06.684 [2024-07-25 13:25:47.471671] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:06.684 [2024-07-25 13:25:47.471685] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:06.684 [2024-07-25 13:25:47.471764] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device 
register 0x12476d0 00:17:06.684 [2024-07-25 13:25:47.471771] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:06.684 [2024-07-25 13:25:47.471904] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e6e10 00:17:06.684 [2024-07-25 13:25:47.472005] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12476d0 00:17:06.684 [2024-07-25 13:25:47.472010] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12476d0 00:17:06.684 [2024-07-25 13:25:47.472081] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:06.944 pt3 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:06.944 13:25:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:06.944 "name": "raid_bdev1", 00:17:06.944 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:17:06.944 "strip_size_kb": 0, 00:17:06.944 "state": "online", 00:17:06.944 "raid_level": "raid1", 00:17:06.944 "superblock": true, 00:17:06.944 "num_base_bdevs": 3, 00:17:06.944 "num_base_bdevs_discovered": 2, 00:17:06.944 "num_base_bdevs_operational": 2, 00:17:06.944 "base_bdevs_list": [ 00:17:06.944 { 00:17:06.944 "name": null, 00:17:06.944 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:06.944 "is_configured": false, 00:17:06.944 "data_offset": 2048, 00:17:06.944 "data_size": 63488 00:17:06.944 }, 00:17:06.944 { 00:17:06.944 "name": "pt2", 00:17:06.944 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:06.944 "is_configured": true, 00:17:06.944 "data_offset": 2048, 00:17:06.944 "data_size": 63488 00:17:06.944 }, 00:17:06.944 { 00:17:06.944 "name": "pt3", 00:17:06.944 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:06.944 "is_configured": true, 00:17:06.944 "data_offset": 2048, 00:17:06.944 "data_size": 63488 00:17:06.944 } 00:17:06.944 ] 00:17:06.944 }' 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:06.944 13:25:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:07.513 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:07.773 [2024-07-25 13:25:48.393591] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:07.773 [2024-07-25 13:25:48.393610] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:07.773 [2024-07-25 13:25:48.393650] bdev_raid.c: 
487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:07.773 [2024-07-25 13:25:48.393689] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:07.773 [2024-07-25 13:25:48.393696] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12476d0 name raid_bdev1, state offline 00:17:07.773 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.773 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:17:08.032 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:17:08.032 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:17:08.032 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 3 -gt 2 ']' 00:17:08.032 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@549 -- # i=2 00:17:08.032 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@550 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:17:08.032 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:08.292 [2024-07-25 13:25:48.954993] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:08.292 [2024-07-25 13:25:48.955023] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:08.292 [2024-07-25 13:25:48.955034] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x124ce50 00:17:08.292 [2024-07-25 13:25:48.955040] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:08.292 [2024-07-25 13:25:48.956315] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:08.292 [2024-07-25 13:25:48.956342] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:08.292 [2024-07-25 13:25:48.956389] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:17:08.292 [2024-07-25 13:25:48.956408] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:17:08.292 [2024-07-25 13:25:48.956485] bdev_raid.c:3665:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:17:08.292 [2024-07-25 13:25:48.956493] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:08.292 [2024-07-25 13:25:48.956501] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12434a0 name raid_bdev1, state configuring 00:17:08.292 [2024-07-25 13:25:48.956515] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:08.292 pt1 00:17:08.292 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 3 -gt 2 ']' 00:17:08.292 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@560 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:17:08.292 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:08.292 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:08.292 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:08.292 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:08.292 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:08.292 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:08.292 13:25:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:08.293 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:08.293 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:08.293 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.293 13:25:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:08.552 13:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.552 "name": "raid_bdev1", 00:17:08.552 "uuid": "267000e3-6249-4a45-92fd-4ef03f31123c", 00:17:08.552 "strip_size_kb": 0, 00:17:08.552 "state": "configuring", 00:17:08.552 "raid_level": "raid1", 00:17:08.552 "superblock": true, 00:17:08.552 "num_base_bdevs": 3, 00:17:08.552 "num_base_bdevs_discovered": 1, 00:17:08.552 "num_base_bdevs_operational": 2, 00:17:08.552 "base_bdevs_list": [ 00:17:08.552 { 00:17:08.552 "name": null, 00:17:08.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.552 "is_configured": false, 00:17:08.552 "data_offset": 2048, 00:17:08.552 "data_size": 63488 00:17:08.552 }, 00:17:08.552 { 00:17:08.552 "name": "pt2", 00:17:08.552 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:08.552 "is_configured": true, 00:17:08.552 "data_offset": 2048, 00:17:08.552 "data_size": 63488 00:17:08.552 }, 00:17:08.552 { 00:17:08.552 "name": null, 00:17:08.552 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:08.552 "is_configured": false, 00:17:08.552 "data_offset": 2048, 00:17:08.552 "data_size": 63488 00:17:08.552 } 00:17:08.552 ] 00:17:08.552 }' 00:17:08.552 13:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.552 13:25:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:09.122 13:25:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:17:09.122 13:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:09.122 13:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # [[ false == \f\a\l\s\e ]] 00:17:09.122 13:25:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:09.383 [2024-07-25 13:25:50.085877] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:09.383 [2024-07-25 13:25:50.085919] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:09.383 [2024-07-25 13:25:50.085935] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1244130 00:17:09.383 [2024-07-25 13:25:50.085942] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:09.383 [2024-07-25 13:25:50.086213] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:09.383 [2024-07-25 13:25:50.086226] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:09.383 [2024-07-25 13:25:50.086271] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:17:09.383 [2024-07-25 13:25:50.086285] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:09.383 [2024-07-25 13:25:50.086363] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1247c00 00:17:09.383 [2024-07-25 13:25:50.086370] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:09.383 [2024-07-25 13:25:50.086501] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 
0x13e7560 00:17:09.383 [2024-07-25 13:25:50.086615] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1247c00 00:17:09.383 [2024-07-25 13:25:50.086622] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1247c00 00:17:09.383 [2024-07-25 13:25:50.086693] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:09.383 pt3 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:09.383 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.643 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:09.643 "name": "raid_bdev1", 00:17:09.643 "uuid": 
"267000e3-6249-4a45-92fd-4ef03f31123c", 00:17:09.643 "strip_size_kb": 0, 00:17:09.643 "state": "online", 00:17:09.643 "raid_level": "raid1", 00:17:09.643 "superblock": true, 00:17:09.643 "num_base_bdevs": 3, 00:17:09.643 "num_base_bdevs_discovered": 2, 00:17:09.643 "num_base_bdevs_operational": 2, 00:17:09.643 "base_bdevs_list": [ 00:17:09.643 { 00:17:09.643 "name": null, 00:17:09.643 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:09.643 "is_configured": false, 00:17:09.643 "data_offset": 2048, 00:17:09.643 "data_size": 63488 00:17:09.643 }, 00:17:09.643 { 00:17:09.643 "name": "pt2", 00:17:09.643 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:09.643 "is_configured": true, 00:17:09.643 "data_offset": 2048, 00:17:09.643 "data_size": 63488 00:17:09.643 }, 00:17:09.643 { 00:17:09.643 "name": "pt3", 00:17:09.643 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:09.643 "is_configured": true, 00:17:09.643 "data_offset": 2048, 00:17:09.643 "data_size": 63488 00:17:09.643 } 00:17:09.643 ] 00:17:09.643 }' 00:17:09.643 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:09.643 13:25:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.213 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:17:10.213 13:25:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:17:10.474 [2024-07-25 
13:25:51.196885] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' 267000e3-6249-4a45-92fd-4ef03f31123c '!=' 267000e3-6249-4a45-92fd-4ef03f31123c ']' 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 935505 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 935505 ']' 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 935505 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 935505 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 935505' 00:17:10.474 killing process with pid 935505 00:17:10.474 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 935505 00:17:10.474 [2024-07-25 13:25:51.263243] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:10.474 [2024-07-25 13:25:51.263284] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:10.474 [2024-07-25 13:25:51.263322] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:10.474 [2024-07-25 13:25:51.263328] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1247c00 name raid_bdev1, state offline 00:17:10.474 13:25:51 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 935505 00:17:10.734 [2024-07-25 13:25:51.278388] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:10.734 13:25:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:17:10.734 00:17:10.734 real 0m18.586s 00:17:10.734 user 0m34.725s 00:17:10.734 sys 0m2.740s 00:17:10.734 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:10.734 13:25:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.734 ************************************ 00:17:10.734 END TEST raid_superblock_test 00:17:10.734 ************************************ 00:17:10.734 13:25:51 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:17:10.734 13:25:51 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:10.734 13:25:51 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:10.734 13:25:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:10.734 ************************************ 00:17:10.734 START TEST raid_read_error_test 00:17:10.734 ************************************ 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 3 read 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:17:10.734 13:25:51 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:17:10.734 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.S7bppgu7bZ 00:17:10.735 13:25:51 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=939050 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 939050 /var/tmp/spdk-raid.sock 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 939050 ']' 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:10.735 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:10.735 13:25:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:10.996 [2024-07-25 13:25:51.542265] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:17:10.996 [2024-07-25 13:25:51.542314] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid939050 ] 00:17:10.996 [2024-07-25 13:25:51.628913] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:10.996 [2024-07-25 13:25:51.701556] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:10.996 [2024-07-25 13:25:51.752013] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:10.996 [2024-07-25 13:25:51.752039] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:11.256 13:25:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:11.256 13:25:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:17:11.256 13:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:11.256 13:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:11.516 BaseBdev1_malloc 00:17:11.516 13:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:11.776 true 00:17:11.776 13:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:12.036 [2024-07-25 13:25:52.583241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:12.036 [2024-07-25 13:25:52.583273] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:17:12.036 [2024-07-25 13:25:52.583284] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12282a0 00:17:12.036 [2024-07-25 13:25:52.583290] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.036 [2024-07-25 13:25:52.584575] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.036 [2024-07-25 13:25:52.584596] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:12.036 BaseBdev1 00:17:12.036 13:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:12.036 13:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:12.036 BaseBdev2_malloc 00:17:12.036 13:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:12.295 true 00:17:12.296 13:25:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:12.556 [2024-07-25 13:25:53.138569] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:12.556 [2024-07-25 13:25:53.138597] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:12.556 [2024-07-25 13:25:53.138610] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12e7420 00:17:12.556 [2024-07-25 13:25:53.138616] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:12.556 [2024-07-25 13:25:53.139787] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:12.556 [2024-07-25 13:25:53.139805] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:12.556 BaseBdev2 00:17:12.556 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:12.556 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:12.556 BaseBdev3_malloc 00:17:12.556 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:12.815 true 00:17:12.815 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:13.075 [2024-07-25 13:25:53.709952] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:13.075 [2024-07-25 13:25:53.709982] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:13.075 [2024-07-25 13:25:53.709996] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12e8f70 00:17:13.075 [2024-07-25 13:25:53.710002] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:13.075 [2024-07-25 13:25:53.711182] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:13.075 [2024-07-25 13:25:53.711202] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:13.075 BaseBdev3 00:17:13.075 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:13.335 [2024-07-25 13:25:53.898447] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:13.335 [2024-07-25 13:25:53.899458] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:13.335 [2024-07-25 13:25:53.899514] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:13.335 [2024-07-25 13:25:53.899666] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x12eacc0 00:17:13.335 [2024-07-25 13:25:53.899673] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:13.335 [2024-07-25 13:25:53.899820] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12ebfd0 00:17:13.335 [2024-07-25 13:25:53.899938] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12eacc0 00:17:13.335 [2024-07-25 13:25:53.899944] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12eacc0 00:17:13.335 [2024-07-25 13:25:53.900031] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.335 
13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.335 13:25:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:13.595 13:25:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.595 "name": "raid_bdev1", 00:17:13.595 "uuid": "0f0a510a-38de-437e-ab5c-325ffd703563", 00:17:13.595 "strip_size_kb": 0, 00:17:13.595 "state": "online", 00:17:13.595 "raid_level": "raid1", 00:17:13.595 "superblock": true, 00:17:13.595 "num_base_bdevs": 3, 00:17:13.595 "num_base_bdevs_discovered": 3, 00:17:13.595 "num_base_bdevs_operational": 3, 00:17:13.595 "base_bdevs_list": [ 00:17:13.595 { 00:17:13.595 "name": "BaseBdev1", 00:17:13.595 "uuid": "8593ecc3-7849-51fa-9dff-60b104313bf0", 00:17:13.595 "is_configured": true, 00:17:13.595 "data_offset": 2048, 00:17:13.595 "data_size": 63488 00:17:13.595 }, 00:17:13.595 { 00:17:13.595 "name": "BaseBdev2", 00:17:13.595 "uuid": "e87b27f0-0082-5907-aa76-dc5b4bce9e53", 00:17:13.595 "is_configured": true, 00:17:13.595 "data_offset": 2048, 00:17:13.595 "data_size": 63488 00:17:13.595 }, 00:17:13.595 { 00:17:13.595 "name": "BaseBdev3", 00:17:13.595 "uuid": "d87688fe-cfd6-5aac-ab07-dcf2476dc7b7", 00:17:13.595 "is_configured": true, 00:17:13.595 "data_offset": 2048, 00:17:13.595 "data_size": 63488 00:17:13.595 } 00:17:13.595 ] 00:17:13.595 }' 00:17:13.595 13:25:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.595 13:25:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:14.165 13:25:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- 
# sleep 1 00:17:14.165 13:25:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:14.165 [2024-07-25 13:25:54.772909] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13d5620 00:17:15.104 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:15.104 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=3 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.105 13:25:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:15.403 13:25:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.403 "name": "raid_bdev1", 00:17:15.403 "uuid": "0f0a510a-38de-437e-ab5c-325ffd703563", 00:17:15.403 "strip_size_kb": 0, 00:17:15.403 "state": "online", 00:17:15.403 "raid_level": "raid1", 00:17:15.403 "superblock": true, 00:17:15.403 "num_base_bdevs": 3, 00:17:15.403 "num_base_bdevs_discovered": 3, 00:17:15.403 "num_base_bdevs_operational": 3, 00:17:15.403 "base_bdevs_list": [ 00:17:15.403 { 00:17:15.403 "name": "BaseBdev1", 00:17:15.403 "uuid": "8593ecc3-7849-51fa-9dff-60b104313bf0", 00:17:15.403 "is_configured": true, 00:17:15.403 "data_offset": 2048, 00:17:15.403 "data_size": 63488 00:17:15.403 }, 00:17:15.403 { 00:17:15.403 "name": "BaseBdev2", 00:17:15.403 "uuid": "e87b27f0-0082-5907-aa76-dc5b4bce9e53", 00:17:15.403 "is_configured": true, 00:17:15.403 "data_offset": 2048, 00:17:15.403 "data_size": 63488 00:17:15.403 }, 00:17:15.403 { 00:17:15.403 "name": "BaseBdev3", 00:17:15.403 "uuid": "d87688fe-cfd6-5aac-ab07-dcf2476dc7b7", 00:17:15.403 "is_configured": true, 00:17:15.403 "data_offset": 2048, 00:17:15.403 "data_size": 63488 00:17:15.403 } 00:17:15.403 ] 00:17:15.403 }' 00:17:15.403 13:25:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.403 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:15.971 13:25:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:16.231 [2024-07-25 13:25:56.798520] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:16.231 [2024-07-25 13:25:56.798557] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:16.231 [2024-07-25 13:25:56.801122] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:16.231 [2024-07-25 13:25:56.801147] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:16.231 [2024-07-25 13:25:56.801224] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:16.231 [2024-07-25 13:25:56.801230] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12eacc0 name raid_bdev1, state offline 00:17:16.231 0 00:17:16.231 13:25:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 939050 00:17:16.231 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 939050 ']' 00:17:16.231 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 939050 00:17:16.231 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:17:16.231 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:16.231 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 939050 00:17:16.231 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:16.231 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:16.232 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 939050' 00:17:16.232 killing process with pid 939050 00:17:16.232 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 
939050 00:17:16.232 [2024-07-25 13:25:56.882041] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:16.232 13:25:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 939050 00:17:16.232 [2024-07-25 13:25:56.893614] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:16.232 13:25:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:17:16.232 13:25:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.S7bppgu7bZ 00:17:16.232 13:25:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:17:16.232 13:25:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:17:16.232 13:25:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:17:16.232 13:25:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:16.232 13:25:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:16.232 13:25:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:16.232 00:17:16.232 real 0m5.548s 00:17:16.232 user 0m9.190s 00:17:16.232 sys 0m0.846s 00:17:16.232 13:25:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:16.492 13:25:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.492 ************************************ 00:17:16.492 END TEST raid_read_error_test 00:17:16.492 ************************************ 00:17:16.492 13:25:57 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:17:16.492 13:25:57 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:16.492 13:25:57 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:16.492 13:25:57 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:16.492 ************************************ 
00:17:16.492 START TEST raid_write_error_test 00:17:16.492 ************************************ 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 3 write 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=3 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local 
raid_bdev_name=raid_bdev1 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.vZ8nV74SOI 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=940069 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 940069 /var/tmp/spdk-raid.sock 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 940069 ']' 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:16.492 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:16.492 13:25:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:16.492 [2024-07-25 13:25:57.212319] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:17:16.492 [2024-07-25 13:25:57.212446] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid940069 ] 00:17:16.752 [2024-07-25 13:25:57.354521] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:16.752 [2024-07-25 13:25:57.430748] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:16.752 [2024-07-25 13:25:57.481872] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:16.752 [2024-07-25 13:25:57.481900] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:17.012 13:25:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:17.012 13:25:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:17:17.012 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:17.012 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:17.273 BaseBdev1_malloc 00:17:17.273 13:25:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:17.273 true 00:17:17.273 13:25:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:17.534 [2024-07-25 13:25:58.216564] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:17.534 [2024-07-25 13:25:58.216597] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:17.534 [2024-07-25 13:25:58.216609] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24532a0 00:17:17.534 [2024-07-25 13:25:58.216615] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:17.534 [2024-07-25 13:25:58.217989] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:17.534 [2024-07-25 13:25:58.218010] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:17.534 BaseBdev1 00:17:17.534 13:25:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:17.534 13:25:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:17.795 BaseBdev2_malloc 00:17:17.795 13:25:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:18.055 true 00:17:18.055 13:25:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:18.055 [2024-07-25 13:25:58.795918] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:18.055 [2024-07-25 13:25:58.795948] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:18.055 [2024-07-25 
13:25:58.795961] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2512420 00:17:18.055 [2024-07-25 13:25:58.795967] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:18.055 [2024-07-25 13:25:58.797149] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:18.055 [2024-07-25 13:25:58.797168] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:18.055 BaseBdev2 00:17:18.055 13:25:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:17:18.055 13:25:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:18.315 BaseBdev3_malloc 00:17:18.315 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:18.575 true 00:17:18.575 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:18.835 [2024-07-25 13:25:59.375262] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:18.835 [2024-07-25 13:25:59.375292] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:18.835 [2024-07-25 13:25:59.375305] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2513f70 00:17:18.835 [2024-07-25 13:25:59.375311] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:18.835 [2024-07-25 13:25:59.376498] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:18.835 [2024-07-25 13:25:59.376518] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:18.835 BaseBdev3 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:17:18.835 [2024-07-25 13:25:59.563766] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:18.835 [2024-07-25 13:25:59.564768] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:18.835 [2024-07-25 13:25:59.564823] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:18.835 [2024-07-25 13:25:59.564966] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2515cc0 00:17:18.835 [2024-07-25 13:25:59.564973] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:17:18.835 [2024-07-25 13:25:59.565122] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2516fd0 00:17:18.835 [2024-07-25 13:25:59.565240] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2515cc0 00:17:18.835 [2024-07-25 13:25:59.565245] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2515cc0 00:17:18.835 [2024-07-25 13:25:59.565332] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test 
-- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.835 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:19.095 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.095 "name": "raid_bdev1", 00:17:19.095 "uuid": "911200b8-2765-4366-a630-a931dea096ff", 00:17:19.095 "strip_size_kb": 0, 00:17:19.095 "state": "online", 00:17:19.095 "raid_level": "raid1", 00:17:19.095 "superblock": true, 00:17:19.095 "num_base_bdevs": 3, 00:17:19.095 "num_base_bdevs_discovered": 3, 00:17:19.095 "num_base_bdevs_operational": 3, 00:17:19.095 "base_bdevs_list": [ 00:17:19.095 { 00:17:19.095 "name": "BaseBdev1", 00:17:19.095 "uuid": "345b5779-da6d-50c2-bf7b-8d3837f890dc", 00:17:19.095 "is_configured": true, 00:17:19.095 "data_offset": 2048, 00:17:19.095 "data_size": 63488 00:17:19.095 }, 00:17:19.095 { 00:17:19.095 "name": "BaseBdev2", 00:17:19.095 "uuid": "35b60de5-8ada-5ae1-b93a-6dff57e0495f", 00:17:19.095 "is_configured": true, 00:17:19.095 "data_offset": 2048, 00:17:19.095 "data_size": 63488 00:17:19.095 }, 00:17:19.095 { 00:17:19.095 "name": "BaseBdev3", 00:17:19.095 "uuid": "d1c3816d-09be-5671-a217-6c88e923621f", 00:17:19.095 
"is_configured": true, 00:17:19.095 "data_offset": 2048, 00:17:19.095 "data_size": 63488 00:17:19.095 } 00:17:19.095 ] 00:17:19.095 }' 00:17:19.095 13:25:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.095 13:25:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:19.667 13:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:17:19.667 13:26:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:19.667 [2024-07-25 13:26:00.442246] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2600620 00:17:20.607 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:20.867 [2024-07-25 13:26:01.529623] bdev_raid.c:2263:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:17:20.867 [2024-07-25 13:26:01.529668] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:20.867 [2024-07-25 13:26:01.529846] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x2600620 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]] 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=2 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:17:20.867 13:26:01 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:20.867 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.127 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:21.127 "name": "raid_bdev1", 00:17:21.127 "uuid": "911200b8-2765-4366-a630-a931dea096ff", 00:17:21.127 "strip_size_kb": 0, 00:17:21.127 "state": "online", 00:17:21.127 "raid_level": "raid1", 00:17:21.127 "superblock": true, 00:17:21.127 "num_base_bdevs": 3, 00:17:21.127 "num_base_bdevs_discovered": 2, 00:17:21.127 "num_base_bdevs_operational": 2, 00:17:21.127 "base_bdevs_list": [ 00:17:21.127 { 00:17:21.127 "name": null, 00:17:21.127 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:21.127 "is_configured": false, 00:17:21.127 "data_offset": 2048, 00:17:21.127 "data_size": 63488 00:17:21.127 }, 
00:17:21.127 { 00:17:21.127 "name": "BaseBdev2", 00:17:21.127 "uuid": "35b60de5-8ada-5ae1-b93a-6dff57e0495f", 00:17:21.127 "is_configured": true, 00:17:21.127 "data_offset": 2048, 00:17:21.127 "data_size": 63488 00:17:21.127 }, 00:17:21.127 { 00:17:21.127 "name": "BaseBdev3", 00:17:21.127 "uuid": "d1c3816d-09be-5671-a217-6c88e923621f", 00:17:21.127 "is_configured": true, 00:17:21.127 "data_offset": 2048, 00:17:21.127 "data_size": 63488 00:17:21.127 } 00:17:21.127 ] 00:17:21.127 }' 00:17:21.127 13:26:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:21.127 13:26:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:21.699 13:26:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:22.022 [2024-07-25 13:26:02.509463] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:22.022 [2024-07-25 13:26:02.509489] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:22.022 [2024-07-25 13:26:02.512089] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:22.022 [2024-07-25 13:26:02.512112] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:22.022 [2024-07-25 13:26:02.512171] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:22.022 [2024-07-25 13:26:02.512178] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2515cc0 name raid_bdev1, state offline 00:17:22.022 0 00:17:22.022 13:26:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 940069 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 940069 ']' 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 940069 
00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 940069 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 940069' 00:17:22.023 killing process with pid 940069 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 940069 00:17:22.023 [2024-07-25 13:26:02.578853] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 940069 00:17:22.023 [2024-07-25 13:26:02.589963] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.vZ8nV74SOI 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:17:22.023 
00:17:22.023 real 0m5.625s 00:17:22.023 user 0m9.247s 00:17:22.023 sys 0m0.915s 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:17:22.023 13:26:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.023 ************************************ 00:17:22.023 END TEST raid_write_error_test 00:17:22.023 ************************************ 00:17:22.023 13:26:02 bdev_raid -- bdev/bdev_raid.sh@945 -- # for n in {2..4} 00:17:22.023 13:26:02 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:17:22.023 13:26:02 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:17:22.023 13:26:02 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:22.023 13:26:02 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:22.023 13:26:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:22.289 ************************************ 00:17:22.289 START TEST raid_state_function_test 00:17:22.289 ************************************ 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 4 false 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- 
# echo BaseBdev1 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:22.289 
13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=941140 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 941140' 00:17:22.289 Process raid pid: 941140 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 941140 /var/tmp/spdk-raid.sock 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 941140 ']' 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:22.289 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:22.289 13:26:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:22.289 [2024-07-25 13:26:02.877564] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:17:22.289 [2024-07-25 13:26:02.877620] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:22.289 [2024-07-25 13:26:02.970087] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:22.289 [2024-07-25 13:26:03.036982] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:22.289 [2024-07-25 13:26:03.077586] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:22.289 [2024-07-25 13:26:03.077609] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:23.230 [2024-07-25 13:26:03.888893] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:23.230 [2024-07-25 13:26:03.888926] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:23.230 [2024-07-25 13:26:03.888932] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:23.230 [2024-07-25 13:26:03.888938] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:23.230 [2024-07-25 13:26:03.888943] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:23.230 [2024-07-25 13:26:03.888948] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:23.230 [2024-07-25 13:26:03.888952] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:23.230 [2024-07-25 13:26:03.888957] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:23.230 13:26:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:23.230 13:26:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:23.489 13:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:23.489 "name": "Existed_Raid", 00:17:23.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.490 "strip_size_kb": 64, 00:17:23.490 "state": "configuring", 00:17:23.490 "raid_level": "raid0", 00:17:23.490 "superblock": false, 00:17:23.490 "num_base_bdevs": 4, 00:17:23.490 "num_base_bdevs_discovered": 0, 00:17:23.490 "num_base_bdevs_operational": 4, 00:17:23.490 "base_bdevs_list": [ 00:17:23.490 { 00:17:23.490 "name": "BaseBdev1", 00:17:23.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.490 "is_configured": false, 00:17:23.490 "data_offset": 0, 00:17:23.490 "data_size": 0 00:17:23.490 }, 00:17:23.490 { 00:17:23.490 "name": "BaseBdev2", 00:17:23.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.490 "is_configured": false, 00:17:23.490 "data_offset": 0, 00:17:23.490 "data_size": 0 00:17:23.490 }, 00:17:23.490 { 00:17:23.490 "name": "BaseBdev3", 00:17:23.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.490 "is_configured": false, 00:17:23.490 "data_offset": 0, 00:17:23.490 "data_size": 0 00:17:23.490 }, 00:17:23.490 { 00:17:23.490 "name": "BaseBdev4", 00:17:23.490 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:23.490 "is_configured": false, 00:17:23.490 "data_offset": 0, 00:17:23.490 "data_size": 0 00:17:23.490 } 00:17:23.490 ] 00:17:23.490 }' 00:17:23.490 13:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:23.490 13:26:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:24.058 13:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:24.058 [2024-07-25 13:26:04.811119] 
bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:24.058 [2024-07-25 13:26:04.811146] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xae16f0 name Existed_Raid, state configuring 00:17:24.058 13:26:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:24.319 [2024-07-25 13:26:04.999627] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:24.319 [2024-07-25 13:26:04.999663] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:24.319 [2024-07-25 13:26:04.999669] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:24.319 [2024-07-25 13:26:04.999674] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:24.319 [2024-07-25 13:26:04.999679] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:24.319 [2024-07-25 13:26:04.999684] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:24.319 [2024-07-25 13:26:04.999689] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:24.319 [2024-07-25 13:26:04.999694] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:24.319 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:24.579 [2024-07-25 13:26:05.198864] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:24.579 BaseBdev1 00:17:24.579 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # 
waitforbdev BaseBdev1 00:17:24.579 13:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:24.579 13:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:24.579 13:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:24.579 13:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:24.579 13:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:24.579 13:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:24.837 13:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:24.837 [ 00:17:24.837 { 00:17:24.837 "name": "BaseBdev1", 00:17:24.837 "aliases": [ 00:17:24.837 "5d072842-79ce-453c-a551-9842c95c6f95" 00:17:24.837 ], 00:17:24.837 "product_name": "Malloc disk", 00:17:24.837 "block_size": 512, 00:17:24.837 "num_blocks": 65536, 00:17:24.837 "uuid": "5d072842-79ce-453c-a551-9842c95c6f95", 00:17:24.837 "assigned_rate_limits": { 00:17:24.837 "rw_ios_per_sec": 0, 00:17:24.837 "rw_mbytes_per_sec": 0, 00:17:24.837 "r_mbytes_per_sec": 0, 00:17:24.837 "w_mbytes_per_sec": 0 00:17:24.837 }, 00:17:24.837 "claimed": true, 00:17:24.837 "claim_type": "exclusive_write", 00:17:24.837 "zoned": false, 00:17:24.837 "supported_io_types": { 00:17:24.837 "read": true, 00:17:24.837 "write": true, 00:17:24.837 "unmap": true, 00:17:24.837 "flush": true, 00:17:24.837 "reset": true, 00:17:24.837 "nvme_admin": false, 00:17:24.837 "nvme_io": false, 00:17:24.837 "nvme_io_md": false, 00:17:24.837 "write_zeroes": true, 00:17:24.837 "zcopy": true, 00:17:24.837 
"get_zone_info": false, 00:17:24.837 "zone_management": false, 00:17:24.837 "zone_append": false, 00:17:24.838 "compare": false, 00:17:24.838 "compare_and_write": false, 00:17:24.838 "abort": true, 00:17:24.838 "seek_hole": false, 00:17:24.838 "seek_data": false, 00:17:24.838 "copy": true, 00:17:24.838 "nvme_iov_md": false 00:17:24.838 }, 00:17:24.838 "memory_domains": [ 00:17:24.838 { 00:17:24.838 "dma_device_id": "system", 00:17:24.838 "dma_device_type": 1 00:17:24.838 }, 00:17:24.838 { 00:17:24.838 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.838 "dma_device_type": 2 00:17:24.838 } 00:17:24.838 ], 00:17:24.838 "driver_specific": {} 00:17:24.838 } 00:17:24.838 ] 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:24.838 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:25.097 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:25.097 "name": "Existed_Raid", 00:17:25.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.097 "strip_size_kb": 64, 00:17:25.097 "state": "configuring", 00:17:25.097 "raid_level": "raid0", 00:17:25.097 "superblock": false, 00:17:25.097 "num_base_bdevs": 4, 00:17:25.097 "num_base_bdevs_discovered": 1, 00:17:25.097 "num_base_bdevs_operational": 4, 00:17:25.097 "base_bdevs_list": [ 00:17:25.097 { 00:17:25.097 "name": "BaseBdev1", 00:17:25.097 "uuid": "5d072842-79ce-453c-a551-9842c95c6f95", 00:17:25.097 "is_configured": true, 00:17:25.097 "data_offset": 0, 00:17:25.097 "data_size": 65536 00:17:25.097 }, 00:17:25.097 { 00:17:25.097 "name": "BaseBdev2", 00:17:25.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.097 "is_configured": false, 00:17:25.097 "data_offset": 0, 00:17:25.097 "data_size": 0 00:17:25.097 }, 00:17:25.097 { 00:17:25.097 "name": "BaseBdev3", 00:17:25.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.097 "is_configured": false, 00:17:25.097 "data_offset": 0, 00:17:25.097 "data_size": 0 00:17:25.097 }, 00:17:25.097 { 00:17:25.097 "name": "BaseBdev4", 00:17:25.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:25.097 "is_configured": false, 00:17:25.097 "data_offset": 0, 00:17:25.097 "data_size": 0 00:17:25.097 } 00:17:25.097 ] 00:17:25.097 }' 00:17:25.097 13:26:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:25.097 13:26:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:25.666 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:25.926 [2024-07-25 13:26:06.526219] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:25.926 [2024-07-25 13:26:06.526250] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xae0f60 name Existed_Raid, state configuring 00:17:25.926 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:26.187 [2024-07-25 13:26:06.718732] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:26.187 [2024-07-25 13:26:06.719884] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:26.187 [2024-07-25 13:26:06.719911] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:26.187 [2024-07-25 13:26:06.719918] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:26.187 [2024-07-25 13:26:06.719923] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:26.187 [2024-07-25 13:26:06.719928] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:26.187 [2024-07-25 13:26:06.719934] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:26.187 "name": "Existed_Raid", 00:17:26.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.187 "strip_size_kb": 64, 00:17:26.187 "state": "configuring", 00:17:26.187 "raid_level": "raid0", 00:17:26.187 "superblock": false, 00:17:26.187 "num_base_bdevs": 4, 00:17:26.187 "num_base_bdevs_discovered": 1, 00:17:26.187 "num_base_bdevs_operational": 4, 00:17:26.187 "base_bdevs_list": [ 00:17:26.187 { 00:17:26.187 "name": "BaseBdev1", 00:17:26.187 "uuid": "5d072842-79ce-453c-a551-9842c95c6f95", 00:17:26.187 "is_configured": true, 00:17:26.187 "data_offset": 0, 00:17:26.187 "data_size": 65536 
00:17:26.187 }, 00:17:26.187 { 00:17:26.187 "name": "BaseBdev2", 00:17:26.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.187 "is_configured": false, 00:17:26.187 "data_offset": 0, 00:17:26.187 "data_size": 0 00:17:26.187 }, 00:17:26.187 { 00:17:26.187 "name": "BaseBdev3", 00:17:26.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.187 "is_configured": false, 00:17:26.187 "data_offset": 0, 00:17:26.187 "data_size": 0 00:17:26.187 }, 00:17:26.187 { 00:17:26.187 "name": "BaseBdev4", 00:17:26.187 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:26.187 "is_configured": false, 00:17:26.187 "data_offset": 0, 00:17:26.187 "data_size": 0 00:17:26.187 } 00:17:26.187 ] 00:17:26.187 }' 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:26.187 13:26:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:26.758 13:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:27.018 [2024-07-25 13:26:07.658132] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:27.018 BaseBdev2 00:17:27.018 13:26:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:27.018 13:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:27.018 13:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:27.018 13:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:27.018 13:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:27.018 13:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:27.018 13:26:07 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:27.278 13:26:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:27.278 [ 00:17:27.278 { 00:17:27.278 "name": "BaseBdev2", 00:17:27.278 "aliases": [ 00:17:27.278 "e785e483-1d3f-4cc1-bc86-7481f22b58d5" 00:17:27.278 ], 00:17:27.278 "product_name": "Malloc disk", 00:17:27.278 "block_size": 512, 00:17:27.278 "num_blocks": 65536, 00:17:27.278 "uuid": "e785e483-1d3f-4cc1-bc86-7481f22b58d5", 00:17:27.278 "assigned_rate_limits": { 00:17:27.278 "rw_ios_per_sec": 0, 00:17:27.278 "rw_mbytes_per_sec": 0, 00:17:27.278 "r_mbytes_per_sec": 0, 00:17:27.278 "w_mbytes_per_sec": 0 00:17:27.278 }, 00:17:27.278 "claimed": true, 00:17:27.278 "claim_type": "exclusive_write", 00:17:27.278 "zoned": false, 00:17:27.278 "supported_io_types": { 00:17:27.278 "read": true, 00:17:27.278 "write": true, 00:17:27.278 "unmap": true, 00:17:27.278 "flush": true, 00:17:27.278 "reset": true, 00:17:27.278 "nvme_admin": false, 00:17:27.278 "nvme_io": false, 00:17:27.278 "nvme_io_md": false, 00:17:27.278 "write_zeroes": true, 00:17:27.278 "zcopy": true, 00:17:27.278 "get_zone_info": false, 00:17:27.278 "zone_management": false, 00:17:27.278 "zone_append": false, 00:17:27.278 "compare": false, 00:17:27.278 "compare_and_write": false, 00:17:27.278 "abort": true, 00:17:27.278 "seek_hole": false, 00:17:27.278 "seek_data": false, 00:17:27.278 "copy": true, 00:17:27.278 "nvme_iov_md": false 00:17:27.278 }, 00:17:27.278 "memory_domains": [ 00:17:27.278 { 00:17:27.278 "dma_device_id": "system", 00:17:27.278 "dma_device_type": 1 00:17:27.278 }, 00:17:27.278 { 00:17:27.278 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:27.278 "dma_device_type": 2 00:17:27.278 } 00:17:27.278 
], 00:17:27.278 "driver_specific": {} 00:17:27.278 } 00:17:27.278 ] 00:17:27.278 13:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:27.278 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:27.278 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:27.278 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:27.278 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:27.279 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:27.279 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:27.279 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:27.279 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:27.279 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:27.279 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:27.279 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:27.279 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:27.539 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:27.539 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:27.539 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:27.539 "name": 
"Existed_Raid", 00:17:27.539 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.539 "strip_size_kb": 64, 00:17:27.539 "state": "configuring", 00:17:27.539 "raid_level": "raid0", 00:17:27.539 "superblock": false, 00:17:27.539 "num_base_bdevs": 4, 00:17:27.539 "num_base_bdevs_discovered": 2, 00:17:27.539 "num_base_bdevs_operational": 4, 00:17:27.539 "base_bdevs_list": [ 00:17:27.539 { 00:17:27.539 "name": "BaseBdev1", 00:17:27.539 "uuid": "5d072842-79ce-453c-a551-9842c95c6f95", 00:17:27.539 "is_configured": true, 00:17:27.539 "data_offset": 0, 00:17:27.539 "data_size": 65536 00:17:27.539 }, 00:17:27.539 { 00:17:27.539 "name": "BaseBdev2", 00:17:27.539 "uuid": "e785e483-1d3f-4cc1-bc86-7481f22b58d5", 00:17:27.539 "is_configured": true, 00:17:27.539 "data_offset": 0, 00:17:27.539 "data_size": 65536 00:17:27.539 }, 00:17:27.539 { 00:17:27.539 "name": "BaseBdev3", 00:17:27.539 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.539 "is_configured": false, 00:17:27.539 "data_offset": 0, 00:17:27.539 "data_size": 0 00:17:27.539 }, 00:17:27.539 { 00:17:27.539 "name": "BaseBdev4", 00:17:27.539 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:27.539 "is_configured": false, 00:17:27.539 "data_offset": 0, 00:17:27.539 "data_size": 0 00:17:27.539 } 00:17:27.539 ] 00:17:27.539 }' 00:17:27.539 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:27.539 13:26:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:28.110 13:26:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:28.371 [2024-07-25 13:26:08.986534] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:28.371 BaseBdev3 00:17:28.371 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:28.371 13:26:09 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:28.371 13:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:28.371 13:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:28.371 13:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:28.371 13:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:28.371 13:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:28.632 13:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:28.632 [ 00:17:28.632 { 00:17:28.632 "name": "BaseBdev3", 00:17:28.632 "aliases": [ 00:17:28.632 "5d73c821-2622-4644-abe7-88dcb3354ff3" 00:17:28.632 ], 00:17:28.632 "product_name": "Malloc disk", 00:17:28.632 "block_size": 512, 00:17:28.632 "num_blocks": 65536, 00:17:28.632 "uuid": "5d73c821-2622-4644-abe7-88dcb3354ff3", 00:17:28.632 "assigned_rate_limits": { 00:17:28.632 "rw_ios_per_sec": 0, 00:17:28.632 "rw_mbytes_per_sec": 0, 00:17:28.632 "r_mbytes_per_sec": 0, 00:17:28.632 "w_mbytes_per_sec": 0 00:17:28.632 }, 00:17:28.632 "claimed": true, 00:17:28.632 "claim_type": "exclusive_write", 00:17:28.632 "zoned": false, 00:17:28.632 "supported_io_types": { 00:17:28.632 "read": true, 00:17:28.632 "write": true, 00:17:28.632 "unmap": true, 00:17:28.632 "flush": true, 00:17:28.632 "reset": true, 00:17:28.632 "nvme_admin": false, 00:17:28.632 "nvme_io": false, 00:17:28.632 "nvme_io_md": false, 00:17:28.632 "write_zeroes": true, 00:17:28.632 "zcopy": true, 00:17:28.632 "get_zone_info": false, 00:17:28.632 
"zone_management": false, 00:17:28.632 "zone_append": false, 00:17:28.632 "compare": false, 00:17:28.632 "compare_and_write": false, 00:17:28.632 "abort": true, 00:17:28.632 "seek_hole": false, 00:17:28.632 "seek_data": false, 00:17:28.632 "copy": true, 00:17:28.632 "nvme_iov_md": false 00:17:28.632 }, 00:17:28.632 "memory_domains": [ 00:17:28.632 { 00:17:28.632 "dma_device_id": "system", 00:17:28.632 "dma_device_type": 1 00:17:28.632 }, 00:17:28.632 { 00:17:28.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:28.633 "dma_device_type": 2 00:17:28.633 } 00:17:28.633 ], 00:17:28.633 "driver_specific": {} 00:17:28.633 } 00:17:28.633 ] 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:28.633 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:28.893 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:28.893 "name": "Existed_Raid", 00:17:28.893 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.893 "strip_size_kb": 64, 00:17:28.893 "state": "configuring", 00:17:28.893 "raid_level": "raid0", 00:17:28.893 "superblock": false, 00:17:28.893 "num_base_bdevs": 4, 00:17:28.893 "num_base_bdevs_discovered": 3, 00:17:28.893 "num_base_bdevs_operational": 4, 00:17:28.893 "base_bdevs_list": [ 00:17:28.893 { 00:17:28.893 "name": "BaseBdev1", 00:17:28.893 "uuid": "5d072842-79ce-453c-a551-9842c95c6f95", 00:17:28.894 "is_configured": true, 00:17:28.894 "data_offset": 0, 00:17:28.894 "data_size": 65536 00:17:28.894 }, 00:17:28.894 { 00:17:28.894 "name": "BaseBdev2", 00:17:28.894 "uuid": "e785e483-1d3f-4cc1-bc86-7481f22b58d5", 00:17:28.894 "is_configured": true, 00:17:28.894 "data_offset": 0, 00:17:28.894 "data_size": 65536 00:17:28.894 }, 00:17:28.894 { 00:17:28.894 "name": "BaseBdev3", 00:17:28.894 "uuid": "5d73c821-2622-4644-abe7-88dcb3354ff3", 00:17:28.894 "is_configured": true, 00:17:28.894 "data_offset": 0, 00:17:28.894 "data_size": 65536 00:17:28.894 }, 00:17:28.894 { 00:17:28.894 "name": "BaseBdev4", 00:17:28.894 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:28.894 "is_configured": false, 00:17:28.894 "data_offset": 0, 00:17:28.894 "data_size": 0 00:17:28.894 } 00:17:28.894 ] 00:17:28.894 }' 00:17:28.894 13:26:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:17:28.894 13:26:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:29.464 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:29.724 [2024-07-25 13:26:10.322867] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:29.724 [2024-07-25 13:26:10.322894] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xae1fd0 00:17:29.724 [2024-07-25 13:26:10.322899] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:29.724 [2024-07-25 13:26:10.323048] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc868e0 00:17:29.724 [2024-07-25 13:26:10.323143] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xae1fd0 00:17:29.724 [2024-07-25 13:26:10.323149] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xae1fd0 00:17:29.724 [2024-07-25 13:26:10.323275] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:29.724 BaseBdev4 00:17:29.724 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:29.724 13:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:17:29.724 13:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:29.724 13:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:29.724 13:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:29.724 13:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:29.724 13:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:29.984 [ 00:17:29.984 { 00:17:29.984 "name": "BaseBdev4", 00:17:29.984 "aliases": [ 00:17:29.984 "c8e8b889-5f26-421f-98e2-60b2ee67906e" 00:17:29.984 ], 00:17:29.984 "product_name": "Malloc disk", 00:17:29.984 "block_size": 512, 00:17:29.984 "num_blocks": 65536, 00:17:29.984 "uuid": "c8e8b889-5f26-421f-98e2-60b2ee67906e", 00:17:29.984 "assigned_rate_limits": { 00:17:29.984 "rw_ios_per_sec": 0, 00:17:29.984 "rw_mbytes_per_sec": 0, 00:17:29.984 "r_mbytes_per_sec": 0, 00:17:29.984 "w_mbytes_per_sec": 0 00:17:29.984 }, 00:17:29.984 "claimed": true, 00:17:29.984 "claim_type": "exclusive_write", 00:17:29.984 "zoned": false, 00:17:29.984 "supported_io_types": { 00:17:29.984 "read": true, 00:17:29.984 "write": true, 00:17:29.984 "unmap": true, 00:17:29.984 "flush": true, 00:17:29.984 "reset": true, 00:17:29.984 "nvme_admin": false, 00:17:29.984 "nvme_io": false, 00:17:29.984 "nvme_io_md": false, 00:17:29.984 "write_zeroes": true, 00:17:29.984 "zcopy": true, 00:17:29.984 "get_zone_info": false, 00:17:29.984 "zone_management": false, 00:17:29.984 "zone_append": false, 00:17:29.984 "compare": false, 00:17:29.984 "compare_and_write": false, 00:17:29.984 "abort": true, 00:17:29.984 "seek_hole": false, 00:17:29.984 "seek_data": false, 00:17:29.984 "copy": true, 00:17:29.984 "nvme_iov_md": false 00:17:29.984 }, 00:17:29.984 "memory_domains": [ 00:17:29.984 { 00:17:29.984 "dma_device_id": "system", 00:17:29.984 "dma_device_type": 1 00:17:29.984 }, 00:17:29.984 { 00:17:29.984 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:29.984 "dma_device_type": 2 00:17:29.984 } 00:17:29.984 ], 00:17:29.984 "driver_specific": {} 00:17:29.984 } 00:17:29.984 ] 
00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:29.984 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:30.244 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:30.244 "name": "Existed_Raid", 00:17:30.244 "uuid": "c8ec74ef-4828-4233-8134-7a1307c4382b", 
00:17:30.244 "strip_size_kb": 64, 00:17:30.244 "state": "online", 00:17:30.244 "raid_level": "raid0", 00:17:30.244 "superblock": false, 00:17:30.244 "num_base_bdevs": 4, 00:17:30.244 "num_base_bdevs_discovered": 4, 00:17:30.244 "num_base_bdevs_operational": 4, 00:17:30.244 "base_bdevs_list": [ 00:17:30.244 { 00:17:30.244 "name": "BaseBdev1", 00:17:30.244 "uuid": "5d072842-79ce-453c-a551-9842c95c6f95", 00:17:30.244 "is_configured": true, 00:17:30.244 "data_offset": 0, 00:17:30.244 "data_size": 65536 00:17:30.244 }, 00:17:30.244 { 00:17:30.244 "name": "BaseBdev2", 00:17:30.244 "uuid": "e785e483-1d3f-4cc1-bc86-7481f22b58d5", 00:17:30.244 "is_configured": true, 00:17:30.244 "data_offset": 0, 00:17:30.244 "data_size": 65536 00:17:30.244 }, 00:17:30.244 { 00:17:30.244 "name": "BaseBdev3", 00:17:30.244 "uuid": "5d73c821-2622-4644-abe7-88dcb3354ff3", 00:17:30.244 "is_configured": true, 00:17:30.244 "data_offset": 0, 00:17:30.244 "data_size": 65536 00:17:30.244 }, 00:17:30.244 { 00:17:30.244 "name": "BaseBdev4", 00:17:30.244 "uuid": "c8e8b889-5f26-421f-98e2-60b2ee67906e", 00:17:30.244 "is_configured": true, 00:17:30.244 "data_offset": 0, 00:17:30.244 "data_size": 65536 00:17:30.244 } 00:17:30.244 ] 00:17:30.244 }' 00:17:30.244 13:26:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:30.244 13:26:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:30.814 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:30.814 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:30.814 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:30.814 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:30.814 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local 
base_bdev_names 00:17:30.814 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:30.814 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:30.814 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:31.073 [2024-07-25 13:26:11.666519] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:31.073 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:31.073 "name": "Existed_Raid", 00:17:31.073 "aliases": [ 00:17:31.073 "c8ec74ef-4828-4233-8134-7a1307c4382b" 00:17:31.073 ], 00:17:31.073 "product_name": "Raid Volume", 00:17:31.073 "block_size": 512, 00:17:31.073 "num_blocks": 262144, 00:17:31.073 "uuid": "c8ec74ef-4828-4233-8134-7a1307c4382b", 00:17:31.073 "assigned_rate_limits": { 00:17:31.073 "rw_ios_per_sec": 0, 00:17:31.073 "rw_mbytes_per_sec": 0, 00:17:31.073 "r_mbytes_per_sec": 0, 00:17:31.073 "w_mbytes_per_sec": 0 00:17:31.073 }, 00:17:31.073 "claimed": false, 00:17:31.073 "zoned": false, 00:17:31.073 "supported_io_types": { 00:17:31.073 "read": true, 00:17:31.073 "write": true, 00:17:31.073 "unmap": true, 00:17:31.073 "flush": true, 00:17:31.073 "reset": true, 00:17:31.073 "nvme_admin": false, 00:17:31.073 "nvme_io": false, 00:17:31.073 "nvme_io_md": false, 00:17:31.073 "write_zeroes": true, 00:17:31.073 "zcopy": false, 00:17:31.073 "get_zone_info": false, 00:17:31.073 "zone_management": false, 00:17:31.073 "zone_append": false, 00:17:31.073 "compare": false, 00:17:31.073 "compare_and_write": false, 00:17:31.074 "abort": false, 00:17:31.074 "seek_hole": false, 00:17:31.074 "seek_data": false, 00:17:31.074 "copy": false, 00:17:31.074 "nvme_iov_md": false 00:17:31.074 }, 00:17:31.074 "memory_domains": [ 00:17:31.074 { 00:17:31.074 "dma_device_id": "system", 00:17:31.074 
"dma_device_type": 1 00:17:31.074 }, 00:17:31.074 { 00:17:31.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.074 "dma_device_type": 2 00:17:31.074 }, 00:17:31.074 { 00:17:31.074 "dma_device_id": "system", 00:17:31.074 "dma_device_type": 1 00:17:31.074 }, 00:17:31.074 { 00:17:31.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.074 "dma_device_type": 2 00:17:31.074 }, 00:17:31.074 { 00:17:31.074 "dma_device_id": "system", 00:17:31.074 "dma_device_type": 1 00:17:31.074 }, 00:17:31.074 { 00:17:31.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.074 "dma_device_type": 2 00:17:31.074 }, 00:17:31.074 { 00:17:31.074 "dma_device_id": "system", 00:17:31.074 "dma_device_type": 1 00:17:31.074 }, 00:17:31.074 { 00:17:31.074 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.074 "dma_device_type": 2 00:17:31.074 } 00:17:31.074 ], 00:17:31.074 "driver_specific": { 00:17:31.074 "raid": { 00:17:31.074 "uuid": "c8ec74ef-4828-4233-8134-7a1307c4382b", 00:17:31.074 "strip_size_kb": 64, 00:17:31.074 "state": "online", 00:17:31.074 "raid_level": "raid0", 00:17:31.074 "superblock": false, 00:17:31.074 "num_base_bdevs": 4, 00:17:31.074 "num_base_bdevs_discovered": 4, 00:17:31.074 "num_base_bdevs_operational": 4, 00:17:31.074 "base_bdevs_list": [ 00:17:31.074 { 00:17:31.074 "name": "BaseBdev1", 00:17:31.074 "uuid": "5d072842-79ce-453c-a551-9842c95c6f95", 00:17:31.074 "is_configured": true, 00:17:31.074 "data_offset": 0, 00:17:31.074 "data_size": 65536 00:17:31.074 }, 00:17:31.074 { 00:17:31.074 "name": "BaseBdev2", 00:17:31.074 "uuid": "e785e483-1d3f-4cc1-bc86-7481f22b58d5", 00:17:31.074 "is_configured": true, 00:17:31.074 "data_offset": 0, 00:17:31.074 "data_size": 65536 00:17:31.074 }, 00:17:31.074 { 00:17:31.074 "name": "BaseBdev3", 00:17:31.074 "uuid": "5d73c821-2622-4644-abe7-88dcb3354ff3", 00:17:31.074 "is_configured": true, 00:17:31.074 "data_offset": 0, 00:17:31.074 "data_size": 65536 00:17:31.074 }, 00:17:31.074 { 00:17:31.074 "name": "BaseBdev4", 00:17:31.074 
"uuid": "c8e8b889-5f26-421f-98e2-60b2ee67906e", 00:17:31.074 "is_configured": true, 00:17:31.074 "data_offset": 0, 00:17:31.074 "data_size": 65536 00:17:31.074 } 00:17:31.074 ] 00:17:31.074 } 00:17:31.074 } 00:17:31.074 }' 00:17:31.074 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:31.074 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:31.074 BaseBdev2 00:17:31.074 BaseBdev3 00:17:31.074 BaseBdev4' 00:17:31.074 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:31.074 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:31.074 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:31.333 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:31.333 "name": "BaseBdev1", 00:17:31.334 "aliases": [ 00:17:31.334 "5d072842-79ce-453c-a551-9842c95c6f95" 00:17:31.334 ], 00:17:31.334 "product_name": "Malloc disk", 00:17:31.334 "block_size": 512, 00:17:31.334 "num_blocks": 65536, 00:17:31.334 "uuid": "5d072842-79ce-453c-a551-9842c95c6f95", 00:17:31.334 "assigned_rate_limits": { 00:17:31.334 "rw_ios_per_sec": 0, 00:17:31.334 "rw_mbytes_per_sec": 0, 00:17:31.334 "r_mbytes_per_sec": 0, 00:17:31.334 "w_mbytes_per_sec": 0 00:17:31.334 }, 00:17:31.334 "claimed": true, 00:17:31.334 "claim_type": "exclusive_write", 00:17:31.334 "zoned": false, 00:17:31.334 "supported_io_types": { 00:17:31.334 "read": true, 00:17:31.334 "write": true, 00:17:31.334 "unmap": true, 00:17:31.334 "flush": true, 00:17:31.334 "reset": true, 00:17:31.334 "nvme_admin": false, 00:17:31.334 "nvme_io": false, 00:17:31.334 "nvme_io_md": false, 00:17:31.334 
"write_zeroes": true, 00:17:31.334 "zcopy": true, 00:17:31.334 "get_zone_info": false, 00:17:31.334 "zone_management": false, 00:17:31.334 "zone_append": false, 00:17:31.334 "compare": false, 00:17:31.334 "compare_and_write": false, 00:17:31.334 "abort": true, 00:17:31.334 "seek_hole": false, 00:17:31.334 "seek_data": false, 00:17:31.334 "copy": true, 00:17:31.334 "nvme_iov_md": false 00:17:31.334 }, 00:17:31.334 "memory_domains": [ 00:17:31.334 { 00:17:31.334 "dma_device_id": "system", 00:17:31.334 "dma_device_type": 1 00:17:31.334 }, 00:17:31.334 { 00:17:31.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:31.334 "dma_device_type": 2 00:17:31.334 } 00:17:31.334 ], 00:17:31.334 "driver_specific": {} 00:17:31.334 }' 00:17:31.334 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.334 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.334 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:31.334 13:26:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.334 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.334 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:31.334 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.334 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:31.594 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:31.594 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.594 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:31.594 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:31.594 13:26:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:31.594 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:31.594 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:31.853 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:31.853 "name": "BaseBdev2", 00:17:31.853 "aliases": [ 00:17:31.853 "e785e483-1d3f-4cc1-bc86-7481f22b58d5" 00:17:31.853 ], 00:17:31.853 "product_name": "Malloc disk", 00:17:31.853 "block_size": 512, 00:17:31.853 "num_blocks": 65536, 00:17:31.853 "uuid": "e785e483-1d3f-4cc1-bc86-7481f22b58d5", 00:17:31.853 "assigned_rate_limits": { 00:17:31.853 "rw_ios_per_sec": 0, 00:17:31.853 "rw_mbytes_per_sec": 0, 00:17:31.853 "r_mbytes_per_sec": 0, 00:17:31.853 "w_mbytes_per_sec": 0 00:17:31.853 }, 00:17:31.853 "claimed": true, 00:17:31.853 "claim_type": "exclusive_write", 00:17:31.853 "zoned": false, 00:17:31.853 "supported_io_types": { 00:17:31.853 "read": true, 00:17:31.853 "write": true, 00:17:31.853 "unmap": true, 00:17:31.853 "flush": true, 00:17:31.853 "reset": true, 00:17:31.853 "nvme_admin": false, 00:17:31.853 "nvme_io": false, 00:17:31.853 "nvme_io_md": false, 00:17:31.853 "write_zeroes": true, 00:17:31.853 "zcopy": true, 00:17:31.853 "get_zone_info": false, 00:17:31.853 "zone_management": false, 00:17:31.853 "zone_append": false, 00:17:31.853 "compare": false, 00:17:31.853 "compare_and_write": false, 00:17:31.853 "abort": true, 00:17:31.853 "seek_hole": false, 00:17:31.853 "seek_data": false, 00:17:31.853 "copy": true, 00:17:31.853 "nvme_iov_md": false 00:17:31.853 }, 00:17:31.853 "memory_domains": [ 00:17:31.853 { 00:17:31.853 "dma_device_id": "system", 00:17:31.853 "dma_device_type": 1 00:17:31.853 }, 00:17:31.853 { 00:17:31.853 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:17:31.853 "dma_device_type": 2 00:17:31.853 } 00:17:31.853 ], 00:17:31.853 "driver_specific": {} 00:17:31.853 }' 00:17:31.853 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.854 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:31.854 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:31.854 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.854 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:31.854 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:31.854 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.114 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.114 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.114 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.114 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.114 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.114 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:32.114 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:32.114 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:32.374 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:32.374 "name": "BaseBdev3", 00:17:32.374 "aliases": [ 00:17:32.374 
"5d73c821-2622-4644-abe7-88dcb3354ff3" 00:17:32.374 ], 00:17:32.374 "product_name": "Malloc disk", 00:17:32.374 "block_size": 512, 00:17:32.374 "num_blocks": 65536, 00:17:32.374 "uuid": "5d73c821-2622-4644-abe7-88dcb3354ff3", 00:17:32.374 "assigned_rate_limits": { 00:17:32.374 "rw_ios_per_sec": 0, 00:17:32.374 "rw_mbytes_per_sec": 0, 00:17:32.374 "r_mbytes_per_sec": 0, 00:17:32.374 "w_mbytes_per_sec": 0 00:17:32.374 }, 00:17:32.374 "claimed": true, 00:17:32.374 "claim_type": "exclusive_write", 00:17:32.374 "zoned": false, 00:17:32.374 "supported_io_types": { 00:17:32.374 "read": true, 00:17:32.374 "write": true, 00:17:32.374 "unmap": true, 00:17:32.374 "flush": true, 00:17:32.374 "reset": true, 00:17:32.374 "nvme_admin": false, 00:17:32.374 "nvme_io": false, 00:17:32.374 "nvme_io_md": false, 00:17:32.374 "write_zeroes": true, 00:17:32.374 "zcopy": true, 00:17:32.374 "get_zone_info": false, 00:17:32.374 "zone_management": false, 00:17:32.374 "zone_append": false, 00:17:32.374 "compare": false, 00:17:32.374 "compare_and_write": false, 00:17:32.374 "abort": true, 00:17:32.374 "seek_hole": false, 00:17:32.374 "seek_data": false, 00:17:32.374 "copy": true, 00:17:32.374 "nvme_iov_md": false 00:17:32.374 }, 00:17:32.374 "memory_domains": [ 00:17:32.374 { 00:17:32.374 "dma_device_id": "system", 00:17:32.374 "dma_device_type": 1 00:17:32.374 }, 00:17:32.374 { 00:17:32.374 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.374 "dma_device_type": 2 00:17:32.374 } 00:17:32.374 ], 00:17:32.374 "driver_specific": {} 00:17:32.374 }' 00:17:32.374 13:26:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.374 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.374 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:32.374 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.374 13:26:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.374 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:32.374 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.634 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:32.634 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:32.634 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.634 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:32.634 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:32.634 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:32.634 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:32.634 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:32.894 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:32.894 "name": "BaseBdev4", 00:17:32.894 "aliases": [ 00:17:32.894 "c8e8b889-5f26-421f-98e2-60b2ee67906e" 00:17:32.894 ], 00:17:32.894 "product_name": "Malloc disk", 00:17:32.894 "block_size": 512, 00:17:32.894 "num_blocks": 65536, 00:17:32.894 "uuid": "c8e8b889-5f26-421f-98e2-60b2ee67906e", 00:17:32.894 "assigned_rate_limits": { 00:17:32.894 "rw_ios_per_sec": 0, 00:17:32.894 "rw_mbytes_per_sec": 0, 00:17:32.894 "r_mbytes_per_sec": 0, 00:17:32.894 "w_mbytes_per_sec": 0 00:17:32.894 }, 00:17:32.894 "claimed": true, 00:17:32.894 "claim_type": "exclusive_write", 00:17:32.894 "zoned": false, 00:17:32.894 "supported_io_types": { 00:17:32.894 "read": true, 
00:17:32.894 "write": true, 00:17:32.894 "unmap": true, 00:17:32.894 "flush": true, 00:17:32.894 "reset": true, 00:17:32.894 "nvme_admin": false, 00:17:32.894 "nvme_io": false, 00:17:32.894 "nvme_io_md": false, 00:17:32.894 "write_zeroes": true, 00:17:32.894 "zcopy": true, 00:17:32.894 "get_zone_info": false, 00:17:32.894 "zone_management": false, 00:17:32.894 "zone_append": false, 00:17:32.894 "compare": false, 00:17:32.894 "compare_and_write": false, 00:17:32.894 "abort": true, 00:17:32.894 "seek_hole": false, 00:17:32.894 "seek_data": false, 00:17:32.894 "copy": true, 00:17:32.894 "nvme_iov_md": false 00:17:32.894 }, 00:17:32.894 "memory_domains": [ 00:17:32.894 { 00:17:32.894 "dma_device_id": "system", 00:17:32.894 "dma_device_type": 1 00:17:32.894 }, 00:17:32.894 { 00:17:32.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:32.894 "dma_device_type": 2 00:17:32.894 } 00:17:32.894 ], 00:17:32.894 "driver_specific": {} 00:17:32.894 }' 00:17:32.894 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.894 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:32.894 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:32.894 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.894 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:32.894 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:32.894 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:33.153 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:33.154 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:33.154 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.154 
13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:33.154 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:33.154 13:26:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:33.413 [2024-07-25 13:26:14.016259] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:33.413 [2024-07-25 13:26:14.016280] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:33.413 [2024-07-25 13:26:14.016317] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:17:33.413 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:33.414 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:33.414 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:33.414 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:33.414 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:33.414 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:33.673 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:33.673 "name": "Existed_Raid", 00:17:33.673 "uuid": "c8ec74ef-4828-4233-8134-7a1307c4382b", 00:17:33.673 "strip_size_kb": 64, 00:17:33.673 "state": "offline", 00:17:33.673 "raid_level": "raid0", 00:17:33.673 "superblock": false, 00:17:33.673 "num_base_bdevs": 4, 00:17:33.673 "num_base_bdevs_discovered": 3, 00:17:33.673 "num_base_bdevs_operational": 3, 00:17:33.673 "base_bdevs_list": [ 00:17:33.673 { 00:17:33.673 "name": null, 00:17:33.673 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:33.673 "is_configured": false, 00:17:33.673 "data_offset": 0, 00:17:33.673 "data_size": 65536 00:17:33.673 }, 00:17:33.673 { 00:17:33.673 "name": "BaseBdev2", 00:17:33.673 "uuid": "e785e483-1d3f-4cc1-bc86-7481f22b58d5", 00:17:33.673 "is_configured": true, 00:17:33.673 "data_offset": 0, 00:17:33.673 "data_size": 65536 00:17:33.673 }, 00:17:33.673 { 00:17:33.673 "name": "BaseBdev3", 00:17:33.673 "uuid": "5d73c821-2622-4644-abe7-88dcb3354ff3", 00:17:33.673 "is_configured": true, 00:17:33.673 "data_offset": 0, 00:17:33.673 "data_size": 65536 00:17:33.673 }, 00:17:33.673 { 00:17:33.673 "name": "BaseBdev4", 00:17:33.673 
"uuid": "c8e8b889-5f26-421f-98e2-60b2ee67906e", 00:17:33.673 "is_configured": true, 00:17:33.673 "data_offset": 0, 00:17:33.673 "data_size": 65536 00:17:33.673 } 00:17:33.673 ] 00:17:33.673 }' 00:17:33.673 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:33.674 13:26:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:34.243 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:34.243 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:34.243 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:34.243 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.243 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:34.243 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:34.243 13:26:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:34.503 [2024-07-25 13:26:15.107032] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:34.503 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:34.503 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:34.503 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.503 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:34.763 
13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:34.763 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:34.763 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:34.763 [2024-07-25 13:26:15.497916] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:34.763 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:34.763 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:34.763 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:34.763 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:35.022 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:35.022 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:35.022 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:35.282 [2024-07-25 13:26:15.884703] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:35.282 [2024-07-25 13:26:15.884732] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xae1fd0 name Existed_Raid, state offline 00:17:35.282 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:35.282 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:35.282 13:26:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:35.282 13:26:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:35.542 BaseBdev2 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:35.542 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:35.801 13:26:16 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:36.061 [ 00:17:36.061 { 00:17:36.061 "name": "BaseBdev2", 00:17:36.061 "aliases": [ 00:17:36.061 "a0879f39-a081-451c-8a88-32bb3bfa2b66" 00:17:36.061 ], 00:17:36.061 "product_name": "Malloc disk", 00:17:36.061 "block_size": 512, 00:17:36.061 "num_blocks": 65536, 00:17:36.061 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:36.061 "assigned_rate_limits": { 00:17:36.061 "rw_ios_per_sec": 0, 00:17:36.061 "rw_mbytes_per_sec": 0, 00:17:36.061 "r_mbytes_per_sec": 0, 00:17:36.061 "w_mbytes_per_sec": 0 00:17:36.061 }, 00:17:36.061 "claimed": false, 00:17:36.061 "zoned": false, 00:17:36.061 "supported_io_types": { 00:17:36.061 "read": true, 00:17:36.061 "write": true, 00:17:36.061 "unmap": true, 00:17:36.061 "flush": true, 00:17:36.061 "reset": true, 00:17:36.062 "nvme_admin": false, 00:17:36.062 "nvme_io": false, 00:17:36.062 "nvme_io_md": false, 00:17:36.062 "write_zeroes": true, 00:17:36.062 "zcopy": true, 00:17:36.062 "get_zone_info": false, 00:17:36.062 "zone_management": false, 00:17:36.062 "zone_append": false, 00:17:36.062 "compare": false, 00:17:36.062 "compare_and_write": false, 00:17:36.062 "abort": true, 00:17:36.062 "seek_hole": false, 00:17:36.062 "seek_data": false, 00:17:36.062 "copy": true, 00:17:36.062 "nvme_iov_md": false 00:17:36.062 }, 00:17:36.062 "memory_domains": [ 00:17:36.062 { 00:17:36.062 "dma_device_id": "system", 00:17:36.062 "dma_device_type": 1 00:17:36.062 }, 00:17:36.062 { 00:17:36.062 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.062 "dma_device_type": 2 00:17:36.062 } 00:17:36.062 ], 00:17:36.062 "driver_specific": {} 00:17:36.062 } 00:17:36.062 ] 00:17:36.062 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:36.062 13:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:36.062 13:26:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:36.062 13:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:36.062 BaseBdev3 00:17:36.062 13:26:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:17:36.062 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:17:36.062 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:36.062 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:36.062 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:36.062 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:36.062 13:26:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:36.321 13:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:36.582 [ 00:17:36.583 { 00:17:36.583 "name": "BaseBdev3", 00:17:36.583 "aliases": [ 00:17:36.583 "e7ac7497-f460-4b9f-b161-bb719da898ea" 00:17:36.583 ], 00:17:36.583 "product_name": "Malloc disk", 00:17:36.583 "block_size": 512, 00:17:36.583 "num_blocks": 65536, 00:17:36.583 "uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:36.583 "assigned_rate_limits": { 00:17:36.583 "rw_ios_per_sec": 0, 00:17:36.583 "rw_mbytes_per_sec": 0, 00:17:36.583 "r_mbytes_per_sec": 0, 00:17:36.583 "w_mbytes_per_sec": 0 00:17:36.583 }, 00:17:36.583 "claimed": false, 00:17:36.583 
"zoned": false, 00:17:36.583 "supported_io_types": { 00:17:36.583 "read": true, 00:17:36.583 "write": true, 00:17:36.583 "unmap": true, 00:17:36.583 "flush": true, 00:17:36.583 "reset": true, 00:17:36.583 "nvme_admin": false, 00:17:36.583 "nvme_io": false, 00:17:36.583 "nvme_io_md": false, 00:17:36.583 "write_zeroes": true, 00:17:36.583 "zcopy": true, 00:17:36.583 "get_zone_info": false, 00:17:36.583 "zone_management": false, 00:17:36.583 "zone_append": false, 00:17:36.583 "compare": false, 00:17:36.583 "compare_and_write": false, 00:17:36.583 "abort": true, 00:17:36.583 "seek_hole": false, 00:17:36.583 "seek_data": false, 00:17:36.583 "copy": true, 00:17:36.583 "nvme_iov_md": false 00:17:36.583 }, 00:17:36.583 "memory_domains": [ 00:17:36.583 { 00:17:36.583 "dma_device_id": "system", 00:17:36.583 "dma_device_type": 1 00:17:36.583 }, 00:17:36.583 { 00:17:36.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:36.583 "dma_device_type": 2 00:17:36.583 } 00:17:36.583 ], 00:17:36.583 "driver_specific": {} 00:17:36.583 } 00:17:36.583 ] 00:17:36.583 13:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:36.583 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:36.583 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:36.583 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:36.843 BaseBdev4 00:17:36.843 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:36.843 13:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:17:36.844 13:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:36.844 13:26:17 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@901 -- # local i 00:17:36.844 13:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:36.844 13:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:36.844 13:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:36.844 13:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:37.103 [ 00:17:37.103 { 00:17:37.103 "name": "BaseBdev4", 00:17:37.103 "aliases": [ 00:17:37.103 "8474296d-8327-41cf-b13a-c88c7c116c62" 00:17:37.103 ], 00:17:37.103 "product_name": "Malloc disk", 00:17:37.103 "block_size": 512, 00:17:37.103 "num_blocks": 65536, 00:17:37.103 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:37.103 "assigned_rate_limits": { 00:17:37.103 "rw_ios_per_sec": 0, 00:17:37.103 "rw_mbytes_per_sec": 0, 00:17:37.103 "r_mbytes_per_sec": 0, 00:17:37.103 "w_mbytes_per_sec": 0 00:17:37.103 }, 00:17:37.103 "claimed": false, 00:17:37.103 "zoned": false, 00:17:37.103 "supported_io_types": { 00:17:37.103 "read": true, 00:17:37.103 "write": true, 00:17:37.103 "unmap": true, 00:17:37.103 "flush": true, 00:17:37.103 "reset": true, 00:17:37.103 "nvme_admin": false, 00:17:37.103 "nvme_io": false, 00:17:37.103 "nvme_io_md": false, 00:17:37.103 "write_zeroes": true, 00:17:37.103 "zcopy": true, 00:17:37.103 "get_zone_info": false, 00:17:37.103 "zone_management": false, 00:17:37.103 "zone_append": false, 00:17:37.103 "compare": false, 00:17:37.103 "compare_and_write": false, 00:17:37.103 "abort": true, 00:17:37.103 "seek_hole": false, 00:17:37.103 "seek_data": false, 00:17:37.103 "copy": true, 00:17:37.103 "nvme_iov_md": false 00:17:37.103 }, 00:17:37.103 
"memory_domains": [ 00:17:37.103 { 00:17:37.103 "dma_device_id": "system", 00:17:37.103 "dma_device_type": 1 00:17:37.103 }, 00:17:37.103 { 00:17:37.103 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:37.103 "dma_device_type": 2 00:17:37.103 } 00:17:37.103 ], 00:17:37.103 "driver_specific": {} 00:17:37.103 } 00:17:37.103 ] 00:17:37.103 13:26:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:37.103 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:37.104 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:37.104 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:37.362 [2024-07-25 13:26:17.963815] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:37.362 [2024-07-25 13:26:17.963848] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:37.362 [2024-07-25 13:26:17.963861] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:37.362 [2024-07-25 13:26:17.964903] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:37.362 [2024-07-25 13:26:17.964936] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:37.362 13:26:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:37.621 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:37.621 "name": "Existed_Raid", 00:17:37.621 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.621 "strip_size_kb": 64, 00:17:37.621 "state": "configuring", 00:17:37.621 "raid_level": "raid0", 00:17:37.621 "superblock": false, 00:17:37.621 "num_base_bdevs": 4, 00:17:37.621 "num_base_bdevs_discovered": 3, 00:17:37.621 "num_base_bdevs_operational": 4, 00:17:37.621 "base_bdevs_list": [ 00:17:37.621 { 00:17:37.621 "name": "BaseBdev1", 00:17:37.621 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:37.621 "is_configured": false, 00:17:37.621 "data_offset": 0, 00:17:37.621 "data_size": 0 00:17:37.621 }, 00:17:37.621 { 00:17:37.621 "name": "BaseBdev2", 00:17:37.621 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:37.621 "is_configured": true, 00:17:37.621 "data_offset": 0, 00:17:37.621 "data_size": 65536 00:17:37.621 }, 
00:17:37.621 { 00:17:37.621 "name": "BaseBdev3", 00:17:37.621 "uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:37.621 "is_configured": true, 00:17:37.621 "data_offset": 0, 00:17:37.621 "data_size": 65536 00:17:37.621 }, 00:17:37.621 { 00:17:37.621 "name": "BaseBdev4", 00:17:37.621 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:37.621 "is_configured": true, 00:17:37.621 "data_offset": 0, 00:17:37.621 "data_size": 65536 00:17:37.621 } 00:17:37.621 ] 00:17:37.621 }' 00:17:37.621 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:37.621 13:26:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:38.190 [2024-07-25 13:26:18.870077] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:38.190 13:26:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:38.190 13:26:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:38.450 13:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:38.450 "name": "Existed_Raid", 00:17:38.450 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.450 "strip_size_kb": 64, 00:17:38.450 "state": "configuring", 00:17:38.450 "raid_level": "raid0", 00:17:38.450 "superblock": false, 00:17:38.450 "num_base_bdevs": 4, 00:17:38.450 "num_base_bdevs_discovered": 2, 00:17:38.450 "num_base_bdevs_operational": 4, 00:17:38.450 "base_bdevs_list": [ 00:17:38.450 { 00:17:38.450 "name": "BaseBdev1", 00:17:38.450 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:38.450 "is_configured": false, 00:17:38.451 "data_offset": 0, 00:17:38.451 "data_size": 0 00:17:38.451 }, 00:17:38.451 { 00:17:38.451 "name": null, 00:17:38.451 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:38.451 "is_configured": false, 00:17:38.451 "data_offset": 0, 00:17:38.451 "data_size": 65536 00:17:38.451 }, 00:17:38.451 { 00:17:38.451 "name": "BaseBdev3", 00:17:38.451 "uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:38.451 "is_configured": true, 00:17:38.451 "data_offset": 0, 00:17:38.451 "data_size": 65536 00:17:38.451 }, 00:17:38.451 { 00:17:38.451 "name": "BaseBdev4", 00:17:38.451 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:38.451 "is_configured": true, 00:17:38.451 "data_offset": 0, 00:17:38.451 "data_size": 65536 00:17:38.451 } 00:17:38.451 ] 00:17:38.451 }' 00:17:38.451 13:26:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.451 13:26:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.020 13:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.020 13:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:39.281 13:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:39.281 13:26:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:39.281 [2024-07-25 13:26:20.026088] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:39.281 BaseBdev1 00:17:39.281 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:39.281 13:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:39.281 13:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:39.281 13:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:39.281 13:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:39.281 13:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:39.281 13:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:39.540 13:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:39.799 [ 00:17:39.799 { 00:17:39.799 "name": "BaseBdev1", 00:17:39.799 "aliases": [ 00:17:39.799 "48713caa-e34c-40b3-a7dd-cbee693405fa" 00:17:39.799 ], 00:17:39.799 "product_name": "Malloc disk", 00:17:39.799 "block_size": 512, 00:17:39.799 "num_blocks": 65536, 00:17:39.799 "uuid": "48713caa-e34c-40b3-a7dd-cbee693405fa", 00:17:39.799 "assigned_rate_limits": { 00:17:39.799 "rw_ios_per_sec": 0, 00:17:39.799 "rw_mbytes_per_sec": 0, 00:17:39.799 "r_mbytes_per_sec": 0, 00:17:39.799 "w_mbytes_per_sec": 0 00:17:39.799 }, 00:17:39.799 "claimed": true, 00:17:39.799 "claim_type": "exclusive_write", 00:17:39.799 "zoned": false, 00:17:39.799 "supported_io_types": { 00:17:39.799 "read": true, 00:17:39.799 "write": true, 00:17:39.799 "unmap": true, 00:17:39.799 "flush": true, 00:17:39.799 "reset": true, 00:17:39.799 "nvme_admin": false, 00:17:39.799 "nvme_io": false, 00:17:39.799 "nvme_io_md": false, 00:17:39.799 "write_zeroes": true, 00:17:39.799 "zcopy": true, 00:17:39.799 "get_zone_info": false, 00:17:39.799 "zone_management": false, 00:17:39.799 "zone_append": false, 00:17:39.799 "compare": false, 00:17:39.799 "compare_and_write": false, 00:17:39.799 "abort": true, 00:17:39.799 "seek_hole": false, 00:17:39.799 "seek_data": false, 00:17:39.799 "copy": true, 00:17:39.799 "nvme_iov_md": false 00:17:39.799 }, 00:17:39.799 "memory_domains": [ 00:17:39.799 { 00:17:39.799 "dma_device_id": "system", 00:17:39.799 "dma_device_type": 1 00:17:39.799 }, 00:17:39.799 { 00:17:39.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:39.799 "dma_device_type": 2 00:17:39.799 } 00:17:39.799 ], 00:17:39.799 "driver_specific": {} 00:17:39.799 } 00:17:39.799 ] 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # 
verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.799 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:40.058 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.058 "name": "Existed_Raid", 00:17:40.058 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:40.058 "strip_size_kb": 64, 00:17:40.058 "state": "configuring", 00:17:40.058 "raid_level": "raid0", 00:17:40.058 "superblock": false, 00:17:40.058 "num_base_bdevs": 4, 00:17:40.058 "num_base_bdevs_discovered": 3, 00:17:40.058 "num_base_bdevs_operational": 4, 00:17:40.058 "base_bdevs_list": [ 00:17:40.058 { 00:17:40.058 "name": "BaseBdev1", 00:17:40.058 "uuid": 
"48713caa-e34c-40b3-a7dd-cbee693405fa", 00:17:40.058 "is_configured": true, 00:17:40.058 "data_offset": 0, 00:17:40.058 "data_size": 65536 00:17:40.058 }, 00:17:40.058 { 00:17:40.058 "name": null, 00:17:40.058 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:40.058 "is_configured": false, 00:17:40.058 "data_offset": 0, 00:17:40.058 "data_size": 65536 00:17:40.058 }, 00:17:40.058 { 00:17:40.058 "name": "BaseBdev3", 00:17:40.058 "uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:40.058 "is_configured": true, 00:17:40.058 "data_offset": 0, 00:17:40.058 "data_size": 65536 00:17:40.058 }, 00:17:40.058 { 00:17:40.058 "name": "BaseBdev4", 00:17:40.058 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:40.058 "is_configured": true, 00:17:40.058 "data_offset": 0, 00:17:40.058 "data_size": 65536 00:17:40.058 } 00:17:40.058 ] 00:17:40.058 }' 00:17:40.058 13:26:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.058 13:26:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.628 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.628 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:40.628 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:40.628 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:40.888 [2024-07-25 13:26:21.489809] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:40.888 13:26:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:40.888 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:41.148 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:41.148 "name": "Existed_Raid", 00:17:41.148 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:41.148 "strip_size_kb": 64, 00:17:41.148 "state": "configuring", 00:17:41.148 "raid_level": "raid0", 00:17:41.148 "superblock": false, 00:17:41.148 "num_base_bdevs": 4, 00:17:41.148 "num_base_bdevs_discovered": 2, 00:17:41.148 "num_base_bdevs_operational": 4, 00:17:41.148 "base_bdevs_list": [ 00:17:41.148 { 00:17:41.148 "name": "BaseBdev1", 00:17:41.148 "uuid": "48713caa-e34c-40b3-a7dd-cbee693405fa", 00:17:41.148 "is_configured": true, 00:17:41.148 
"data_offset": 0, 00:17:41.148 "data_size": 65536 00:17:41.148 }, 00:17:41.148 { 00:17:41.148 "name": null, 00:17:41.148 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:41.148 "is_configured": false, 00:17:41.148 "data_offset": 0, 00:17:41.148 "data_size": 65536 00:17:41.148 }, 00:17:41.148 { 00:17:41.148 "name": null, 00:17:41.148 "uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:41.148 "is_configured": false, 00:17:41.148 "data_offset": 0, 00:17:41.148 "data_size": 65536 00:17:41.148 }, 00:17:41.148 { 00:17:41.148 "name": "BaseBdev4", 00:17:41.148 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:41.148 "is_configured": true, 00:17:41.148 "data_offset": 0, 00:17:41.148 "data_size": 65536 00:17:41.148 } 00:17:41.148 ] 00:17:41.148 }' 00:17:41.148 13:26:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:41.148 13:26:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:41.716 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.716 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:41.716 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:41.716 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:41.976 [2024-07-25 13:26:22.612656] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:41.976 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:42.236 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:42.236 "name": "Existed_Raid", 00:17:42.236 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:42.236 "strip_size_kb": 64, 00:17:42.236 "state": "configuring", 00:17:42.236 "raid_level": "raid0", 00:17:42.236 "superblock": false, 00:17:42.236 "num_base_bdevs": 4, 00:17:42.236 "num_base_bdevs_discovered": 3, 00:17:42.236 "num_base_bdevs_operational": 4, 00:17:42.236 "base_bdevs_list": [ 00:17:42.236 { 00:17:42.236 "name": "BaseBdev1", 00:17:42.236 "uuid": "48713caa-e34c-40b3-a7dd-cbee693405fa", 00:17:42.236 "is_configured": true, 00:17:42.236 "data_offset": 0, 00:17:42.236 "data_size": 65536 00:17:42.236 }, 00:17:42.236 { 
00:17:42.236 "name": null, 00:17:42.236 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:42.236 "is_configured": false, 00:17:42.236 "data_offset": 0, 00:17:42.236 "data_size": 65536 00:17:42.236 }, 00:17:42.236 { 00:17:42.236 "name": "BaseBdev3", 00:17:42.236 "uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:42.236 "is_configured": true, 00:17:42.236 "data_offset": 0, 00:17:42.236 "data_size": 65536 00:17:42.236 }, 00:17:42.236 { 00:17:42.236 "name": "BaseBdev4", 00:17:42.236 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:42.236 "is_configured": true, 00:17:42.236 "data_offset": 0, 00:17:42.236 "data_size": 65536 00:17:42.236 } 00:17:42.236 ] 00:17:42.236 }' 00:17:42.236 13:26:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:42.236 13:26:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:42.805 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:42.805 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:42.805 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:42.805 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:43.064 [2024-07-25 13:26:23.743520] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # 
local expected_state=configuring 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.064 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:43.324 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:43.324 "name": "Existed_Raid", 00:17:43.324 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:43.324 "strip_size_kb": 64, 00:17:43.324 "state": "configuring", 00:17:43.324 "raid_level": "raid0", 00:17:43.324 "superblock": false, 00:17:43.324 "num_base_bdevs": 4, 00:17:43.324 "num_base_bdevs_discovered": 2, 00:17:43.324 "num_base_bdevs_operational": 4, 00:17:43.324 "base_bdevs_list": [ 00:17:43.324 { 00:17:43.324 "name": null, 00:17:43.324 "uuid": "48713caa-e34c-40b3-a7dd-cbee693405fa", 00:17:43.324 "is_configured": false, 00:17:43.324 "data_offset": 0, 00:17:43.324 "data_size": 65536 00:17:43.324 }, 00:17:43.324 { 00:17:43.324 "name": null, 00:17:43.324 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:43.324 "is_configured": false, 
00:17:43.324 "data_offset": 0, 00:17:43.324 "data_size": 65536 00:17:43.324 }, 00:17:43.324 { 00:17:43.324 "name": "BaseBdev3", 00:17:43.324 "uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:43.324 "is_configured": true, 00:17:43.324 "data_offset": 0, 00:17:43.324 "data_size": 65536 00:17:43.324 }, 00:17:43.324 { 00:17:43.324 "name": "BaseBdev4", 00:17:43.324 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:43.324 "is_configured": true, 00:17:43.324 "data_offset": 0, 00:17:43.324 "data_size": 65536 00:17:43.324 } 00:17:43.324 ] 00:17:43.324 }' 00:17:43.324 13:26:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:43.324 13:26:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:43.892 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:43.892 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:44.152 [2024-07-25 13:26:24.872251] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.152 13:26:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:44.411 13:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:44.411 "name": "Existed_Raid", 00:17:44.411 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:44.411 "strip_size_kb": 64, 00:17:44.411 "state": "configuring", 00:17:44.411 "raid_level": "raid0", 00:17:44.411 "superblock": false, 00:17:44.411 "num_base_bdevs": 4, 00:17:44.411 "num_base_bdevs_discovered": 3, 00:17:44.411 "num_base_bdevs_operational": 4, 00:17:44.411 "base_bdevs_list": [ 00:17:44.411 { 00:17:44.411 "name": null, 00:17:44.411 "uuid": "48713caa-e34c-40b3-a7dd-cbee693405fa", 00:17:44.411 "is_configured": false, 00:17:44.411 "data_offset": 0, 00:17:44.411 "data_size": 65536 00:17:44.411 }, 00:17:44.411 { 00:17:44.411 "name": "BaseBdev2", 00:17:44.411 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:44.411 "is_configured": true, 00:17:44.411 "data_offset": 0, 00:17:44.411 "data_size": 65536 00:17:44.411 }, 
00:17:44.411 { 00:17:44.411 "name": "BaseBdev3", 00:17:44.411 "uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:44.411 "is_configured": true, 00:17:44.411 "data_offset": 0, 00:17:44.411 "data_size": 65536 00:17:44.411 }, 00:17:44.411 { 00:17:44.411 "name": "BaseBdev4", 00:17:44.411 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:44.411 "is_configured": true, 00:17:44.411 "data_offset": 0, 00:17:44.411 "data_size": 65536 00:17:44.411 } 00:17:44.411 ] 00:17:44.411 }' 00:17:44.411 13:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:44.411 13:26:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:44.980 13:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:44.980 13:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:45.240 13:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:45.240 13:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:45.240 13:26:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:45.240 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 48713caa-e34c-40b3-a7dd-cbee693405fa 00:17:45.498 [2024-07-25 13:26:26.208633] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:45.498 [2024-07-25 13:26:26.208659] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xadfff0 00:17:45.498 [2024-07-25 13:26:26.208663] 
bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:17:45.498 [2024-07-25 13:26:26.208818] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xae3170 00:17:45.498 [2024-07-25 13:26:26.208910] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xadfff0 00:17:45.498 [2024-07-25 13:26:26.208915] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xadfff0 00:17:45.498 [2024-07-25 13:26:26.209034] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:45.498 NewBaseBdev 00:17:45.498 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:45.498 13:26:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:17:45.498 13:26:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:45.498 13:26:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:17:45.498 13:26:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:45.498 13:26:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:45.498 13:26:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:45.788 13:26:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:46.073 [ 00:17:46.073 { 00:17:46.073 "name": "NewBaseBdev", 00:17:46.073 "aliases": [ 00:17:46.073 "48713caa-e34c-40b3-a7dd-cbee693405fa" 00:17:46.073 ], 00:17:46.073 "product_name": "Malloc disk", 00:17:46.073 "block_size": 512, 00:17:46.073 "num_blocks": 65536, 
00:17:46.073 "uuid": "48713caa-e34c-40b3-a7dd-cbee693405fa", 00:17:46.073 "assigned_rate_limits": { 00:17:46.073 "rw_ios_per_sec": 0, 00:17:46.073 "rw_mbytes_per_sec": 0, 00:17:46.073 "r_mbytes_per_sec": 0, 00:17:46.073 "w_mbytes_per_sec": 0 00:17:46.073 }, 00:17:46.073 "claimed": true, 00:17:46.073 "claim_type": "exclusive_write", 00:17:46.073 "zoned": false, 00:17:46.073 "supported_io_types": { 00:17:46.073 "read": true, 00:17:46.073 "write": true, 00:17:46.073 "unmap": true, 00:17:46.073 "flush": true, 00:17:46.073 "reset": true, 00:17:46.073 "nvme_admin": false, 00:17:46.073 "nvme_io": false, 00:17:46.073 "nvme_io_md": false, 00:17:46.073 "write_zeroes": true, 00:17:46.073 "zcopy": true, 00:17:46.073 "get_zone_info": false, 00:17:46.073 "zone_management": false, 00:17:46.073 "zone_append": false, 00:17:46.073 "compare": false, 00:17:46.073 "compare_and_write": false, 00:17:46.073 "abort": true, 00:17:46.073 "seek_hole": false, 00:17:46.073 "seek_data": false, 00:17:46.073 "copy": true, 00:17:46.073 "nvme_iov_md": false 00:17:46.073 }, 00:17:46.073 "memory_domains": [ 00:17:46.073 { 00:17:46.073 "dma_device_id": "system", 00:17:46.073 "dma_device_type": 1 00:17:46.073 }, 00:17:46.073 { 00:17:46.073 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.073 "dma_device_type": 2 00:17:46.073 } 00:17:46.073 ], 00:17:46.073 "driver_specific": {} 00:17:46.073 } 00:17:46.073 ] 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:46.073 
13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:46.073 "name": "Existed_Raid", 00:17:46.073 "uuid": "80d13c02-b65d-40b9-a6d7-f4d813f3ec78", 00:17:46.073 "strip_size_kb": 64, 00:17:46.073 "state": "online", 00:17:46.073 "raid_level": "raid0", 00:17:46.073 "superblock": false, 00:17:46.073 "num_base_bdevs": 4, 00:17:46.073 "num_base_bdevs_discovered": 4, 00:17:46.073 "num_base_bdevs_operational": 4, 00:17:46.073 "base_bdevs_list": [ 00:17:46.073 { 00:17:46.073 "name": "NewBaseBdev", 00:17:46.073 "uuid": "48713caa-e34c-40b3-a7dd-cbee693405fa", 00:17:46.073 "is_configured": true, 00:17:46.073 "data_offset": 0, 00:17:46.073 "data_size": 65536 00:17:46.073 }, 00:17:46.073 { 00:17:46.073 "name": "BaseBdev2", 00:17:46.073 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:46.073 "is_configured": true, 00:17:46.073 "data_offset": 0, 00:17:46.073 "data_size": 65536 00:17:46.073 }, 00:17:46.073 { 00:17:46.073 "name": "BaseBdev3", 00:17:46.073 
"uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:46.073 "is_configured": true, 00:17:46.073 "data_offset": 0, 00:17:46.073 "data_size": 65536 00:17:46.073 }, 00:17:46.073 { 00:17:46.073 "name": "BaseBdev4", 00:17:46.073 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:46.073 "is_configured": true, 00:17:46.073 "data_offset": 0, 00:17:46.073 "data_size": 65536 00:17:46.073 } 00:17:46.073 ] 00:17:46.073 }' 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:46.073 13:26:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:46.642 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:46.642 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:46.642 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:46.642 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:46.642 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:46.642 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:46.642 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:46.642 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:46.902 [2024-07-25 13:26:27.504176] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:46.902 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:46.902 "name": "Existed_Raid", 00:17:46.902 "aliases": [ 00:17:46.902 "80d13c02-b65d-40b9-a6d7-f4d813f3ec78" 00:17:46.902 ], 00:17:46.902 "product_name": "Raid Volume", 
00:17:46.902 "block_size": 512, 00:17:46.902 "num_blocks": 262144, 00:17:46.902 "uuid": "80d13c02-b65d-40b9-a6d7-f4d813f3ec78", 00:17:46.902 "assigned_rate_limits": { 00:17:46.902 "rw_ios_per_sec": 0, 00:17:46.902 "rw_mbytes_per_sec": 0, 00:17:46.902 "r_mbytes_per_sec": 0, 00:17:46.902 "w_mbytes_per_sec": 0 00:17:46.902 }, 00:17:46.902 "claimed": false, 00:17:46.902 "zoned": false, 00:17:46.902 "supported_io_types": { 00:17:46.902 "read": true, 00:17:46.902 "write": true, 00:17:46.902 "unmap": true, 00:17:46.902 "flush": true, 00:17:46.902 "reset": true, 00:17:46.902 "nvme_admin": false, 00:17:46.902 "nvme_io": false, 00:17:46.902 "nvme_io_md": false, 00:17:46.902 "write_zeroes": true, 00:17:46.902 "zcopy": false, 00:17:46.902 "get_zone_info": false, 00:17:46.902 "zone_management": false, 00:17:46.902 "zone_append": false, 00:17:46.902 "compare": false, 00:17:46.902 "compare_and_write": false, 00:17:46.902 "abort": false, 00:17:46.902 "seek_hole": false, 00:17:46.902 "seek_data": false, 00:17:46.902 "copy": false, 00:17:46.902 "nvme_iov_md": false 00:17:46.902 }, 00:17:46.902 "memory_domains": [ 00:17:46.902 { 00:17:46.902 "dma_device_id": "system", 00:17:46.902 "dma_device_type": 1 00:17:46.902 }, 00:17:46.902 { 00:17:46.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.902 "dma_device_type": 2 00:17:46.902 }, 00:17:46.902 { 00:17:46.902 "dma_device_id": "system", 00:17:46.902 "dma_device_type": 1 00:17:46.902 }, 00:17:46.902 { 00:17:46.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.902 "dma_device_type": 2 00:17:46.902 }, 00:17:46.902 { 00:17:46.902 "dma_device_id": "system", 00:17:46.902 "dma_device_type": 1 00:17:46.902 }, 00:17:46.902 { 00:17:46.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.902 "dma_device_type": 2 00:17:46.902 }, 00:17:46.902 { 00:17:46.902 "dma_device_id": "system", 00:17:46.902 "dma_device_type": 1 00:17:46.902 }, 00:17:46.902 { 00:17:46.902 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:46.902 "dma_device_type": 2 
00:17:46.902 } 00:17:46.902 ], 00:17:46.902 "driver_specific": { 00:17:46.902 "raid": { 00:17:46.902 "uuid": "80d13c02-b65d-40b9-a6d7-f4d813f3ec78", 00:17:46.902 "strip_size_kb": 64, 00:17:46.902 "state": "online", 00:17:46.902 "raid_level": "raid0", 00:17:46.902 "superblock": false, 00:17:46.902 "num_base_bdevs": 4, 00:17:46.902 "num_base_bdevs_discovered": 4, 00:17:46.902 "num_base_bdevs_operational": 4, 00:17:46.902 "base_bdevs_list": [ 00:17:46.902 { 00:17:46.902 "name": "NewBaseBdev", 00:17:46.902 "uuid": "48713caa-e34c-40b3-a7dd-cbee693405fa", 00:17:46.902 "is_configured": true, 00:17:46.902 "data_offset": 0, 00:17:46.902 "data_size": 65536 00:17:46.902 }, 00:17:46.902 { 00:17:46.902 "name": "BaseBdev2", 00:17:46.902 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:46.902 "is_configured": true, 00:17:46.902 "data_offset": 0, 00:17:46.902 "data_size": 65536 00:17:46.902 }, 00:17:46.902 { 00:17:46.902 "name": "BaseBdev3", 00:17:46.902 "uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:46.902 "is_configured": true, 00:17:46.902 "data_offset": 0, 00:17:46.902 "data_size": 65536 00:17:46.902 }, 00:17:46.902 { 00:17:46.902 "name": "BaseBdev4", 00:17:46.902 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:46.902 "is_configured": true, 00:17:46.902 "data_offset": 0, 00:17:46.902 "data_size": 65536 00:17:46.902 } 00:17:46.903 ] 00:17:46.903 } 00:17:46.903 } 00:17:46.903 }' 00:17:46.903 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:46.903 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:46.903 BaseBdev2 00:17:46.903 BaseBdev3 00:17:46.903 BaseBdev4' 00:17:46.903 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:46.903 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:46.903 13:26:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:47.162 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.162 "name": "NewBaseBdev", 00:17:47.162 "aliases": [ 00:17:47.162 "48713caa-e34c-40b3-a7dd-cbee693405fa" 00:17:47.162 ], 00:17:47.162 "product_name": "Malloc disk", 00:17:47.162 "block_size": 512, 00:17:47.162 "num_blocks": 65536, 00:17:47.162 "uuid": "48713caa-e34c-40b3-a7dd-cbee693405fa", 00:17:47.162 "assigned_rate_limits": { 00:17:47.162 "rw_ios_per_sec": 0, 00:17:47.162 "rw_mbytes_per_sec": 0, 00:17:47.162 "r_mbytes_per_sec": 0, 00:17:47.162 "w_mbytes_per_sec": 0 00:17:47.162 }, 00:17:47.162 "claimed": true, 00:17:47.162 "claim_type": "exclusive_write", 00:17:47.162 "zoned": false, 00:17:47.162 "supported_io_types": { 00:17:47.162 "read": true, 00:17:47.162 "write": true, 00:17:47.162 "unmap": true, 00:17:47.162 "flush": true, 00:17:47.162 "reset": true, 00:17:47.162 "nvme_admin": false, 00:17:47.162 "nvme_io": false, 00:17:47.162 "nvme_io_md": false, 00:17:47.162 "write_zeroes": true, 00:17:47.162 "zcopy": true, 00:17:47.162 "get_zone_info": false, 00:17:47.162 "zone_management": false, 00:17:47.162 "zone_append": false, 00:17:47.162 "compare": false, 00:17:47.162 "compare_and_write": false, 00:17:47.162 "abort": true, 00:17:47.162 "seek_hole": false, 00:17:47.162 "seek_data": false, 00:17:47.162 "copy": true, 00:17:47.162 "nvme_iov_md": false 00:17:47.162 }, 00:17:47.162 "memory_domains": [ 00:17:47.162 { 00:17:47.162 "dma_device_id": "system", 00:17:47.162 "dma_device_type": 1 00:17:47.162 }, 00:17:47.162 { 00:17:47.162 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.162 "dma_device_type": 2 00:17:47.162 } 00:17:47.162 ], 00:17:47.162 "driver_specific": {} 00:17:47.162 }' 00:17:47.162 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:17:47.162 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.162 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.162 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.162 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.162 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.162 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.422 13:26:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.422 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.422 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.422 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.422 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.422 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.422 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:47.422 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.682 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:47.682 "name": "BaseBdev2", 00:17:47.682 "aliases": [ 00:17:47.682 "a0879f39-a081-451c-8a88-32bb3bfa2b66" 00:17:47.682 ], 00:17:47.682 "product_name": "Malloc disk", 00:17:47.682 "block_size": 512, 00:17:47.682 "num_blocks": 65536, 00:17:47.682 "uuid": "a0879f39-a081-451c-8a88-32bb3bfa2b66", 00:17:47.682 
"assigned_rate_limits": { 00:17:47.682 "rw_ios_per_sec": 0, 00:17:47.682 "rw_mbytes_per_sec": 0, 00:17:47.682 "r_mbytes_per_sec": 0, 00:17:47.682 "w_mbytes_per_sec": 0 00:17:47.682 }, 00:17:47.682 "claimed": true, 00:17:47.682 "claim_type": "exclusive_write", 00:17:47.682 "zoned": false, 00:17:47.682 "supported_io_types": { 00:17:47.682 "read": true, 00:17:47.682 "write": true, 00:17:47.682 "unmap": true, 00:17:47.682 "flush": true, 00:17:47.682 "reset": true, 00:17:47.682 "nvme_admin": false, 00:17:47.682 "nvme_io": false, 00:17:47.682 "nvme_io_md": false, 00:17:47.682 "write_zeroes": true, 00:17:47.682 "zcopy": true, 00:17:47.682 "get_zone_info": false, 00:17:47.682 "zone_management": false, 00:17:47.682 "zone_append": false, 00:17:47.682 "compare": false, 00:17:47.682 "compare_and_write": false, 00:17:47.682 "abort": true, 00:17:47.682 "seek_hole": false, 00:17:47.682 "seek_data": false, 00:17:47.682 "copy": true, 00:17:47.682 "nvme_iov_md": false 00:17:47.682 }, 00:17:47.682 "memory_domains": [ 00:17:47.682 { 00:17:47.682 "dma_device_id": "system", 00:17:47.682 "dma_device_type": 1 00:17:47.682 }, 00:17:47.682 { 00:17:47.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:47.682 "dma_device_type": 2 00:17:47.682 } 00:17:47.682 ], 00:17:47.682 "driver_specific": {} 00:17:47.682 }' 00:17:47.682 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.682 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:47.682 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:47.682 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.682 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:47.941 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:47.941 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:17:47.941 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:47.941 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:47.941 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.941 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:47.941 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:47.941 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:47.941 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:47.941 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:48.201 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.201 "name": "BaseBdev3", 00:17:48.201 "aliases": [ 00:17:48.201 "e7ac7497-f460-4b9f-b161-bb719da898ea" 00:17:48.201 ], 00:17:48.201 "product_name": "Malloc disk", 00:17:48.201 "block_size": 512, 00:17:48.201 "num_blocks": 65536, 00:17:48.201 "uuid": "e7ac7497-f460-4b9f-b161-bb719da898ea", 00:17:48.201 "assigned_rate_limits": { 00:17:48.201 "rw_ios_per_sec": 0, 00:17:48.201 "rw_mbytes_per_sec": 0, 00:17:48.201 "r_mbytes_per_sec": 0, 00:17:48.201 "w_mbytes_per_sec": 0 00:17:48.201 }, 00:17:48.201 "claimed": true, 00:17:48.201 "claim_type": "exclusive_write", 00:17:48.201 "zoned": false, 00:17:48.201 "supported_io_types": { 00:17:48.201 "read": true, 00:17:48.201 "write": true, 00:17:48.201 "unmap": true, 00:17:48.201 "flush": true, 00:17:48.201 "reset": true, 00:17:48.201 "nvme_admin": false, 00:17:48.201 "nvme_io": false, 00:17:48.201 "nvme_io_md": false, 00:17:48.201 "write_zeroes": true, 00:17:48.201 "zcopy": 
true, 00:17:48.201 "get_zone_info": false, 00:17:48.201 "zone_management": false, 00:17:48.201 "zone_append": false, 00:17:48.201 "compare": false, 00:17:48.201 "compare_and_write": false, 00:17:48.201 "abort": true, 00:17:48.201 "seek_hole": false, 00:17:48.201 "seek_data": false, 00:17:48.201 "copy": true, 00:17:48.201 "nvme_iov_md": false 00:17:48.201 }, 00:17:48.201 "memory_domains": [ 00:17:48.201 { 00:17:48.201 "dma_device_id": "system", 00:17:48.201 "dma_device_type": 1 00:17:48.201 }, 00:17:48.201 { 00:17:48.201 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.201 "dma_device_type": 2 00:17:48.201 } 00:17:48.201 ], 00:17:48.201 "driver_specific": {} 00:17:48.201 }' 00:17:48.201 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.201 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.201 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.201 13:26:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 
-- # for name in $base_bdev_names 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:48.461 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:48.721 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:48.721 "name": "BaseBdev4", 00:17:48.721 "aliases": [ 00:17:48.721 "8474296d-8327-41cf-b13a-c88c7c116c62" 00:17:48.721 ], 00:17:48.721 "product_name": "Malloc disk", 00:17:48.721 "block_size": 512, 00:17:48.721 "num_blocks": 65536, 00:17:48.721 "uuid": "8474296d-8327-41cf-b13a-c88c7c116c62", 00:17:48.721 "assigned_rate_limits": { 00:17:48.721 "rw_ios_per_sec": 0, 00:17:48.721 "rw_mbytes_per_sec": 0, 00:17:48.721 "r_mbytes_per_sec": 0, 00:17:48.721 "w_mbytes_per_sec": 0 00:17:48.721 }, 00:17:48.721 "claimed": true, 00:17:48.721 "claim_type": "exclusive_write", 00:17:48.721 "zoned": false, 00:17:48.721 "supported_io_types": { 00:17:48.721 "read": true, 00:17:48.721 "write": true, 00:17:48.721 "unmap": true, 00:17:48.721 "flush": true, 00:17:48.721 "reset": true, 00:17:48.721 "nvme_admin": false, 00:17:48.721 "nvme_io": false, 00:17:48.721 "nvme_io_md": false, 00:17:48.721 "write_zeroes": true, 00:17:48.721 "zcopy": true, 00:17:48.721 "get_zone_info": false, 00:17:48.721 "zone_management": false, 00:17:48.721 "zone_append": false, 00:17:48.721 "compare": false, 00:17:48.721 "compare_and_write": false, 00:17:48.721 "abort": true, 00:17:48.721 "seek_hole": false, 00:17:48.721 "seek_data": false, 00:17:48.721 "copy": true, 00:17:48.721 "nvme_iov_md": false 00:17:48.721 }, 00:17:48.721 "memory_domains": [ 00:17:48.721 { 00:17:48.721 "dma_device_id": "system", 00:17:48.721 "dma_device_type": 1 00:17:48.721 }, 00:17:48.721 { 00:17:48.721 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:48.721 "dma_device_type": 2 00:17:48.721 } 
00:17:48.721 ], 00:17:48.721 "driver_specific": {} 00:17:48.721 }' 00:17:48.721 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.721 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:48.980 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:48.980 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.980 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:48.980 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:48.980 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.980 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:48.980 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:48.980 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:48.980 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:49.240 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:49.240 13:26:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:49.240 [2024-07-25 13:26:29.990224] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:49.240 [2024-07-25 13:26:29.990246] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:49.240 [2024-07-25 13:26:29.990288] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:49.240 [2024-07-25 13:26:29.990333] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 
0, going to free all in destruct 00:17:49.240 [2024-07-25 13:26:29.990339] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xadfff0 name Existed_Raid, state offline 00:17:49.240 13:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 941140 00:17:49.240 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 941140 ']' 00:17:49.240 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 941140 00:17:49.240 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:17:49.240 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:17:49.240 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 941140 00:17:49.499 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:17:49.499 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:17:49.499 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 941140' 00:17:49.499 killing process with pid 941140 00:17:49.499 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 941140 00:17:49.499 [2024-07-25 13:26:30.057982] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:49.499 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 941140 00:17:49.499 [2024-07-25 13:26:30.078437] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:49.499 13:26:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:17:49.499 00:17:49.499 real 0m27.388s 00:17:49.499 user 0m51.299s 00:17:49.499 sys 0m4.040s 00:17:49.499 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:17:49.499 13:26:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:49.499 ************************************ 00:17:49.499 END TEST raid_state_function_test 00:17:49.499 ************************************ 00:17:49.499 13:26:30 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:17:49.499 13:26:30 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:17:49.499 13:26:30 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:17:49.499 13:26:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:49.499 ************************************ 00:17:49.499 START TEST raid_state_function_test_sb 00:17:49.499 ************************************ 00:17:49.499 13:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid0 4 true 00:17:49.499 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # 
echo BaseBdev2 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=946342 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 946342' 00:17:49.500 Process raid pid: 946342 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 946342 /var/tmp/spdk-raid.sock 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 946342 ']' 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:49.500 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:49.500 13:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:17:49.759 13:26:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:49.759 [2024-07-25 13:26:30.340223] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:17:49.760 [2024-07-25 13:26:30.340274] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:49.760 [2024-07-25 13:26:30.434118] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:49.760 [2024-07-25 13:26:30.509411] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:17:50.019 [2024-07-25 13:26:30.562843] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:50.019 [2024-07-25 13:26:30.562870] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:50.588 13:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:17:50.588 13:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:17:50.588 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:50.588 [2024-07-25 13:26:31.363861] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:50.588 [2024-07-25 13:26:31.363895] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:50.588 [2024-07-25 13:26:31.363901] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:50.588 [2024-07-25 13:26:31.363907] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:50.588 [2024-07-25 13:26:31.363912] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:50.588 [2024-07-25 13:26:31.363917] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:17:50.588 [2024-07-25 13:26:31.363922] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:50.588 [2024-07-25 13:26:31.363927] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.848 "name": "Existed_Raid", 00:17:50.848 "uuid": "22ed0f15-e1a8-4938-805a-570840a9c154", 
00:17:50.848 "strip_size_kb": 64, 00:17:50.848 "state": "configuring", 00:17:50.848 "raid_level": "raid0", 00:17:50.848 "superblock": true, 00:17:50.848 "num_base_bdevs": 4, 00:17:50.848 "num_base_bdevs_discovered": 0, 00:17:50.848 "num_base_bdevs_operational": 4, 00:17:50.848 "base_bdevs_list": [ 00:17:50.848 { 00:17:50.848 "name": "BaseBdev1", 00:17:50.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.848 "is_configured": false, 00:17:50.848 "data_offset": 0, 00:17:50.848 "data_size": 0 00:17:50.848 }, 00:17:50.848 { 00:17:50.848 "name": "BaseBdev2", 00:17:50.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.848 "is_configured": false, 00:17:50.848 "data_offset": 0, 00:17:50.848 "data_size": 0 00:17:50.848 }, 00:17:50.848 { 00:17:50.848 "name": "BaseBdev3", 00:17:50.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.848 "is_configured": false, 00:17:50.848 "data_offset": 0, 00:17:50.848 "data_size": 0 00:17:50.848 }, 00:17:50.848 { 00:17:50.848 "name": "BaseBdev4", 00:17:50.848 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:50.848 "is_configured": false, 00:17:50.848 "data_offset": 0, 00:17:50.848 "data_size": 0 00:17:50.848 } 00:17:50.848 ] 00:17:50.848 }' 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.848 13:26:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:51.417 13:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:51.676 [2024-07-25 13:26:32.286078] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:51.676 [2024-07-25 13:26:32.286096] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11fb6f0 name Existed_Raid, state configuring 00:17:51.676 13:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:51.935 [2024-07-25 13:26:32.474589] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:51.935 [2024-07-25 13:26:32.474608] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:51.935 [2024-07-25 13:26:32.474613] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:51.935 [2024-07-25 13:26:32.474618] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:51.935 [2024-07-25 13:26:32.474623] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:51.935 [2024-07-25 13:26:32.474628] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:51.935 [2024-07-25 13:26:32.474633] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:51.935 [2024-07-25 13:26:32.474639] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:51.935 13:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:51.935 [2024-07-25 13:26:32.669650] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:51.935 BaseBdev1 00:17:51.936 13:26:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:17:51.936 13:26:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:17:51.936 13:26:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:51.936 13:26:32 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:51.936 13:26:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:51.936 13:26:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:51.936 13:26:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:52.195 13:26:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:52.454 [ 00:17:52.454 { 00:17:52.454 "name": "BaseBdev1", 00:17:52.454 "aliases": [ 00:17:52.454 "5ee52506-672e-4a83-852e-05dde78f1150" 00:17:52.454 ], 00:17:52.454 "product_name": "Malloc disk", 00:17:52.454 "block_size": 512, 00:17:52.454 "num_blocks": 65536, 00:17:52.454 "uuid": "5ee52506-672e-4a83-852e-05dde78f1150", 00:17:52.454 "assigned_rate_limits": { 00:17:52.454 "rw_ios_per_sec": 0, 00:17:52.454 "rw_mbytes_per_sec": 0, 00:17:52.454 "r_mbytes_per_sec": 0, 00:17:52.454 "w_mbytes_per_sec": 0 00:17:52.454 }, 00:17:52.454 "claimed": true, 00:17:52.454 "claim_type": "exclusive_write", 00:17:52.454 "zoned": false, 00:17:52.454 "supported_io_types": { 00:17:52.454 "read": true, 00:17:52.454 "write": true, 00:17:52.454 "unmap": true, 00:17:52.454 "flush": true, 00:17:52.454 "reset": true, 00:17:52.454 "nvme_admin": false, 00:17:52.454 "nvme_io": false, 00:17:52.454 "nvme_io_md": false, 00:17:52.454 "write_zeroes": true, 00:17:52.454 "zcopy": true, 00:17:52.454 "get_zone_info": false, 00:17:52.454 "zone_management": false, 00:17:52.454 "zone_append": false, 00:17:52.454 "compare": false, 00:17:52.454 "compare_and_write": false, 00:17:52.454 "abort": true, 00:17:52.454 "seek_hole": false, 00:17:52.454 "seek_data": false, 
00:17:52.454 "copy": true, 00:17:52.454 "nvme_iov_md": false 00:17:52.454 }, 00:17:52.454 "memory_domains": [ 00:17:52.454 { 00:17:52.454 "dma_device_id": "system", 00:17:52.454 "dma_device_type": 1 00:17:52.454 }, 00:17:52.454 { 00:17:52.454 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:52.454 "dma_device_type": 2 00:17:52.454 } 00:17:52.454 ], 00:17:52.454 "driver_specific": {} 00:17:52.454 } 00:17:52.454 ] 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:52.454 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:52.715 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:52.715 "name": "Existed_Raid", 00:17:52.715 "uuid": "20cac662-beb7-4cf3-b58f-189bbe8907e9", 00:17:52.715 "strip_size_kb": 64, 00:17:52.715 "state": "configuring", 00:17:52.715 "raid_level": "raid0", 00:17:52.715 "superblock": true, 00:17:52.715 "num_base_bdevs": 4, 00:17:52.715 "num_base_bdevs_discovered": 1, 00:17:52.715 "num_base_bdevs_operational": 4, 00:17:52.715 "base_bdevs_list": [ 00:17:52.715 { 00:17:52.715 "name": "BaseBdev1", 00:17:52.715 "uuid": "5ee52506-672e-4a83-852e-05dde78f1150", 00:17:52.715 "is_configured": true, 00:17:52.715 "data_offset": 2048, 00:17:52.715 "data_size": 63488 00:17:52.715 }, 00:17:52.715 { 00:17:52.715 "name": "BaseBdev2", 00:17:52.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.715 "is_configured": false, 00:17:52.715 "data_offset": 0, 00:17:52.715 "data_size": 0 00:17:52.715 }, 00:17:52.715 { 00:17:52.715 "name": "BaseBdev3", 00:17:52.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.715 "is_configured": false, 00:17:52.715 "data_offset": 0, 00:17:52.715 "data_size": 0 00:17:52.715 }, 00:17:52.715 { 00:17:52.715 "name": "BaseBdev4", 00:17:52.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:52.715 "is_configured": false, 00:17:52.715 "data_offset": 0, 00:17:52.715 "data_size": 0 00:17:52.715 } 00:17:52.715 ] 00:17:52.715 }' 00:17:52.715 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:52.715 13:26:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:53.284 13:26:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:53.284 [2024-07-25 13:26:33.992995] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid 
bdev: Existed_Raid 00:17:53.284 [2024-07-25 13:26:33.993024] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11faf60 name Existed_Raid, state configuring 00:17:53.284 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:53.544 [2024-07-25 13:26:34.177495] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:53.544 [2024-07-25 13:26:34.178662] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:53.544 [2024-07-25 13:26:34.178687] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:17:53.544 [2024-07-25 13:26:34.178693] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:53.544 [2024-07-25 13:26:34.178699] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:53.544 [2024-07-25 13:26:34.178704] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:53.544 [2024-07-25 13:26:34.178709] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:53.544 13:26:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:53.544 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:53.805 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:53.805 "name": "Existed_Raid", 00:17:53.805 "uuid": "be3c260a-7e72-4a78-8c2b-9e31ac43ba9c", 00:17:53.805 "strip_size_kb": 64, 00:17:53.805 "state": "configuring", 00:17:53.805 "raid_level": "raid0", 00:17:53.805 "superblock": true, 00:17:53.805 "num_base_bdevs": 4, 00:17:53.805 "num_base_bdevs_discovered": 1, 00:17:53.805 "num_base_bdevs_operational": 4, 00:17:53.805 "base_bdevs_list": [ 00:17:53.805 { 00:17:53.805 "name": "BaseBdev1", 00:17:53.805 "uuid": "5ee52506-672e-4a83-852e-05dde78f1150", 00:17:53.805 "is_configured": true, 00:17:53.805 "data_offset": 2048, 00:17:53.805 "data_size": 63488 00:17:53.805 }, 00:17:53.805 { 00:17:53.805 "name": "BaseBdev2", 00:17:53.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.805 "is_configured": false, 
00:17:53.805 "data_offset": 0, 00:17:53.805 "data_size": 0 00:17:53.805 }, 00:17:53.805 { 00:17:53.805 "name": "BaseBdev3", 00:17:53.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.805 "is_configured": false, 00:17:53.805 "data_offset": 0, 00:17:53.805 "data_size": 0 00:17:53.805 }, 00:17:53.805 { 00:17:53.805 "name": "BaseBdev4", 00:17:53.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:53.805 "is_configured": false, 00:17:53.805 "data_offset": 0, 00:17:53.805 "data_size": 0 00:17:53.805 } 00:17:53.805 ] 00:17:53.805 }' 00:17:53.805 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:53.805 13:26:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:54.374 13:26:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:54.374 [2024-07-25 13:26:35.132827] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:54.374 BaseBdev2 00:17:54.374 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:54.374 13:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:17:54.374 13:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:54.374 13:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:54.374 13:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:54.374 13:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:54.375 13:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:17:54.634 13:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:54.893 [ 00:17:54.893 { 00:17:54.893 "name": "BaseBdev2", 00:17:54.893 "aliases": [ 00:17:54.893 "b1eb19c2-dc8a-4b9a-9b82-bad6564d4f42" 00:17:54.893 ], 00:17:54.893 "product_name": "Malloc disk", 00:17:54.893 "block_size": 512, 00:17:54.893 "num_blocks": 65536, 00:17:54.893 "uuid": "b1eb19c2-dc8a-4b9a-9b82-bad6564d4f42", 00:17:54.893 "assigned_rate_limits": { 00:17:54.893 "rw_ios_per_sec": 0, 00:17:54.893 "rw_mbytes_per_sec": 0, 00:17:54.893 "r_mbytes_per_sec": 0, 00:17:54.893 "w_mbytes_per_sec": 0 00:17:54.893 }, 00:17:54.893 "claimed": true, 00:17:54.893 "claim_type": "exclusive_write", 00:17:54.893 "zoned": false, 00:17:54.893 "supported_io_types": { 00:17:54.893 "read": true, 00:17:54.893 "write": true, 00:17:54.893 "unmap": true, 00:17:54.893 "flush": true, 00:17:54.893 "reset": true, 00:17:54.893 "nvme_admin": false, 00:17:54.893 "nvme_io": false, 00:17:54.893 "nvme_io_md": false, 00:17:54.893 "write_zeroes": true, 00:17:54.893 "zcopy": true, 00:17:54.893 "get_zone_info": false, 00:17:54.893 "zone_management": false, 00:17:54.893 "zone_append": false, 00:17:54.893 "compare": false, 00:17:54.893 "compare_and_write": false, 00:17:54.893 "abort": true, 00:17:54.893 "seek_hole": false, 00:17:54.893 "seek_data": false, 00:17:54.893 "copy": true, 00:17:54.893 "nvme_iov_md": false 00:17:54.893 }, 00:17:54.893 "memory_domains": [ 00:17:54.893 { 00:17:54.893 "dma_device_id": "system", 00:17:54.893 "dma_device_type": 1 00:17:54.893 }, 00:17:54.893 { 00:17:54.893 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:54.893 "dma_device_type": 2 00:17:54.893 } 00:17:54.893 ], 00:17:54.893 "driver_specific": {} 00:17:54.893 } 00:17:54.893 ] 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 
-- # return 0 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:54.893 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:55.153 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.153 "name": "Existed_Raid", 00:17:55.153 "uuid": "be3c260a-7e72-4a78-8c2b-9e31ac43ba9c", 00:17:55.153 "strip_size_kb": 64, 
00:17:55.153 "state": "configuring", 00:17:55.153 "raid_level": "raid0", 00:17:55.153 "superblock": true, 00:17:55.153 "num_base_bdevs": 4, 00:17:55.153 "num_base_bdevs_discovered": 2, 00:17:55.153 "num_base_bdevs_operational": 4, 00:17:55.153 "base_bdevs_list": [ 00:17:55.153 { 00:17:55.153 "name": "BaseBdev1", 00:17:55.153 "uuid": "5ee52506-672e-4a83-852e-05dde78f1150", 00:17:55.153 "is_configured": true, 00:17:55.153 "data_offset": 2048, 00:17:55.153 "data_size": 63488 00:17:55.153 }, 00:17:55.153 { 00:17:55.153 "name": "BaseBdev2", 00:17:55.153 "uuid": "b1eb19c2-dc8a-4b9a-9b82-bad6564d4f42", 00:17:55.153 "is_configured": true, 00:17:55.153 "data_offset": 2048, 00:17:55.153 "data_size": 63488 00:17:55.153 }, 00:17:55.153 { 00:17:55.153 "name": "BaseBdev3", 00:17:55.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.153 "is_configured": false, 00:17:55.153 "data_offset": 0, 00:17:55.153 "data_size": 0 00:17:55.153 }, 00:17:55.153 { 00:17:55.153 "name": "BaseBdev4", 00:17:55.153 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:55.153 "is_configured": false, 00:17:55.153 "data_offset": 0, 00:17:55.153 "data_size": 0 00:17:55.153 } 00:17:55.153 ] 00:17:55.153 }' 00:17:55.153 13:26:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.153 13:26:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:55.724 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:55.724 [2024-07-25 13:26:36.449218] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:55.724 BaseBdev3 00:17:55.724 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:55.724 13:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local 
bdev_name=BaseBdev3 00:17:55.724 13:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:55.724 13:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:55.724 13:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:55.724 13:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:55.724 13:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:55.983 13:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:56.242 [ 00:17:56.242 { 00:17:56.242 "name": "BaseBdev3", 00:17:56.242 "aliases": [ 00:17:56.243 "d138ddbe-565d-4997-96ad-76835ae8eff0" 00:17:56.243 ], 00:17:56.243 "product_name": "Malloc disk", 00:17:56.243 "block_size": 512, 00:17:56.243 "num_blocks": 65536, 00:17:56.243 "uuid": "d138ddbe-565d-4997-96ad-76835ae8eff0", 00:17:56.243 "assigned_rate_limits": { 00:17:56.243 "rw_ios_per_sec": 0, 00:17:56.243 "rw_mbytes_per_sec": 0, 00:17:56.243 "r_mbytes_per_sec": 0, 00:17:56.243 "w_mbytes_per_sec": 0 00:17:56.243 }, 00:17:56.243 "claimed": true, 00:17:56.243 "claim_type": "exclusive_write", 00:17:56.243 "zoned": false, 00:17:56.243 "supported_io_types": { 00:17:56.243 "read": true, 00:17:56.243 "write": true, 00:17:56.243 "unmap": true, 00:17:56.243 "flush": true, 00:17:56.243 "reset": true, 00:17:56.243 "nvme_admin": false, 00:17:56.243 "nvme_io": false, 00:17:56.243 "nvme_io_md": false, 00:17:56.243 "write_zeroes": true, 00:17:56.243 "zcopy": true, 00:17:56.243 "get_zone_info": false, 00:17:56.243 "zone_management": false, 00:17:56.243 "zone_append": false, 00:17:56.243 
"compare": false, 00:17:56.243 "compare_and_write": false, 00:17:56.243 "abort": true, 00:17:56.243 "seek_hole": false, 00:17:56.243 "seek_data": false, 00:17:56.243 "copy": true, 00:17:56.243 "nvme_iov_md": false 00:17:56.243 }, 00:17:56.243 "memory_domains": [ 00:17:56.243 { 00:17:56.243 "dma_device_id": "system", 00:17:56.243 "dma_device_type": 1 00:17:56.243 }, 00:17:56.243 { 00:17:56.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:56.243 "dma_device_type": 2 00:17:56.243 } 00:17:56.243 ], 00:17:56.243 "driver_specific": {} 00:17:56.243 } 00:17:56.243 ] 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:56.243 13:26:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:56.243 13:26:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:56.243 13:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:56.243 "name": "Existed_Raid", 00:17:56.243 "uuid": "be3c260a-7e72-4a78-8c2b-9e31ac43ba9c", 00:17:56.243 "strip_size_kb": 64, 00:17:56.243 "state": "configuring", 00:17:56.243 "raid_level": "raid0", 00:17:56.243 "superblock": true, 00:17:56.243 "num_base_bdevs": 4, 00:17:56.243 "num_base_bdevs_discovered": 3, 00:17:56.243 "num_base_bdevs_operational": 4, 00:17:56.243 "base_bdevs_list": [ 00:17:56.243 { 00:17:56.243 "name": "BaseBdev1", 00:17:56.243 "uuid": "5ee52506-672e-4a83-852e-05dde78f1150", 00:17:56.243 "is_configured": true, 00:17:56.243 "data_offset": 2048, 00:17:56.243 "data_size": 63488 00:17:56.243 }, 00:17:56.243 { 00:17:56.243 "name": "BaseBdev2", 00:17:56.243 "uuid": "b1eb19c2-dc8a-4b9a-9b82-bad6564d4f42", 00:17:56.243 "is_configured": true, 00:17:56.243 "data_offset": 2048, 00:17:56.243 "data_size": 63488 00:17:56.243 }, 00:17:56.243 { 00:17:56.243 "name": "BaseBdev3", 00:17:56.243 "uuid": "d138ddbe-565d-4997-96ad-76835ae8eff0", 00:17:56.243 "is_configured": true, 00:17:56.243 "data_offset": 2048, 00:17:56.243 "data_size": 63488 00:17:56.243 }, 00:17:56.243 { 00:17:56.243 "name": "BaseBdev4", 00:17:56.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:56.243 "is_configured": false, 00:17:56.243 "data_offset": 0, 00:17:56.243 "data_size": 0 00:17:56.243 } 00:17:56.243 ] 00:17:56.243 }' 00:17:56.243 13:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:56.243 13:26:37 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:56.811 13:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:57.071 [2024-07-25 13:26:37.693356] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:57.071 [2024-07-25 13:26:37.693481] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x11fbfd0 00:17:57.071 [2024-07-25 13:26:37.693490] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:57.071 [2024-07-25 13:26:37.693635] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13a08e0 00:17:57.071 [2024-07-25 13:26:37.693732] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11fbfd0 00:17:57.071 [2024-07-25 13:26:37.693741] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11fbfd0 00:17:57.071 [2024-07-25 13:26:37.693813] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:57.071 BaseBdev4 00:17:57.071 13:26:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:57.071 13:26:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:17:57.071 13:26:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:17:57.071 13:26:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:17:57.071 13:26:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:17:57.071 13:26:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:17:57.071 13:26:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:57.331 13:26:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:57.331 [ 00:17:57.331 { 00:17:57.331 "name": "BaseBdev4", 00:17:57.331 "aliases": [ 00:17:57.331 "6dddd3d5-4dfc-4341-be1b-da03c4454f9e" 00:17:57.331 ], 00:17:57.331 "product_name": "Malloc disk", 00:17:57.331 "block_size": 512, 00:17:57.331 "num_blocks": 65536, 00:17:57.331 "uuid": "6dddd3d5-4dfc-4341-be1b-da03c4454f9e", 00:17:57.331 "assigned_rate_limits": { 00:17:57.331 "rw_ios_per_sec": 0, 00:17:57.331 "rw_mbytes_per_sec": 0, 00:17:57.331 "r_mbytes_per_sec": 0, 00:17:57.331 "w_mbytes_per_sec": 0 00:17:57.331 }, 00:17:57.331 "claimed": true, 00:17:57.331 "claim_type": "exclusive_write", 00:17:57.331 "zoned": false, 00:17:57.331 "supported_io_types": { 00:17:57.331 "read": true, 00:17:57.331 "write": true, 00:17:57.331 "unmap": true, 00:17:57.331 "flush": true, 00:17:57.331 "reset": true, 00:17:57.331 "nvme_admin": false, 00:17:57.331 "nvme_io": false, 00:17:57.331 "nvme_io_md": false, 00:17:57.331 "write_zeroes": true, 00:17:57.331 "zcopy": true, 00:17:57.331 "get_zone_info": false, 00:17:57.331 "zone_management": false, 00:17:57.331 "zone_append": false, 00:17:57.331 "compare": false, 00:17:57.331 "compare_and_write": false, 00:17:57.331 "abort": true, 00:17:57.331 "seek_hole": false, 00:17:57.331 "seek_data": false, 00:17:57.331 "copy": true, 00:17:57.331 "nvme_iov_md": false 00:17:57.331 }, 00:17:57.331 "memory_domains": [ 00:17:57.331 { 00:17:57.331 "dma_device_id": "system", 00:17:57.331 "dma_device_type": 1 00:17:57.331 }, 00:17:57.331 { 00:17:57.331 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:57.331 "dma_device_type": 2 00:17:57.331 } 00:17:57.331 ], 00:17:57.331 "driver_specific": {} 00:17:57.331 } 00:17:57.331 ] 
00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.331 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:57.590 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.590 "name": "Existed_Raid", 00:17:57.590 
"uuid": "be3c260a-7e72-4a78-8c2b-9e31ac43ba9c", 00:17:57.590 "strip_size_kb": 64, 00:17:57.590 "state": "online", 00:17:57.590 "raid_level": "raid0", 00:17:57.590 "superblock": true, 00:17:57.590 "num_base_bdevs": 4, 00:17:57.590 "num_base_bdevs_discovered": 4, 00:17:57.590 "num_base_bdevs_operational": 4, 00:17:57.590 "base_bdevs_list": [ 00:17:57.590 { 00:17:57.590 "name": "BaseBdev1", 00:17:57.590 "uuid": "5ee52506-672e-4a83-852e-05dde78f1150", 00:17:57.590 "is_configured": true, 00:17:57.590 "data_offset": 2048, 00:17:57.590 "data_size": 63488 00:17:57.590 }, 00:17:57.590 { 00:17:57.590 "name": "BaseBdev2", 00:17:57.590 "uuid": "b1eb19c2-dc8a-4b9a-9b82-bad6564d4f42", 00:17:57.590 "is_configured": true, 00:17:57.590 "data_offset": 2048, 00:17:57.590 "data_size": 63488 00:17:57.590 }, 00:17:57.590 { 00:17:57.590 "name": "BaseBdev3", 00:17:57.590 "uuid": "d138ddbe-565d-4997-96ad-76835ae8eff0", 00:17:57.590 "is_configured": true, 00:17:57.591 "data_offset": 2048, 00:17:57.591 "data_size": 63488 00:17:57.591 }, 00:17:57.591 { 00:17:57.591 "name": "BaseBdev4", 00:17:57.591 "uuid": "6dddd3d5-4dfc-4341-be1b-da03c4454f9e", 00:17:57.591 "is_configured": true, 00:17:57.591 "data_offset": 2048, 00:17:57.591 "data_size": 63488 00:17:57.591 } 00:17:57.591 ] 00:17:57.591 }' 00:17:57.591 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.591 13:26:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:58.159 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:58.159 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:58.159 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:58.159 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:58.159 13:26:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:58.159 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:58.159 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:58.159 13:26:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:58.418 [2024-07-25 13:26:39.024997] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:58.418 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:58.418 "name": "Existed_Raid", 00:17:58.418 "aliases": [ 00:17:58.418 "be3c260a-7e72-4a78-8c2b-9e31ac43ba9c" 00:17:58.418 ], 00:17:58.418 "product_name": "Raid Volume", 00:17:58.418 "block_size": 512, 00:17:58.418 "num_blocks": 253952, 00:17:58.418 "uuid": "be3c260a-7e72-4a78-8c2b-9e31ac43ba9c", 00:17:58.418 "assigned_rate_limits": { 00:17:58.418 "rw_ios_per_sec": 0, 00:17:58.418 "rw_mbytes_per_sec": 0, 00:17:58.418 "r_mbytes_per_sec": 0, 00:17:58.418 "w_mbytes_per_sec": 0 00:17:58.418 }, 00:17:58.418 "claimed": false, 00:17:58.418 "zoned": false, 00:17:58.418 "supported_io_types": { 00:17:58.418 "read": true, 00:17:58.418 "write": true, 00:17:58.418 "unmap": true, 00:17:58.418 "flush": true, 00:17:58.418 "reset": true, 00:17:58.418 "nvme_admin": false, 00:17:58.418 "nvme_io": false, 00:17:58.418 "nvme_io_md": false, 00:17:58.418 "write_zeroes": true, 00:17:58.418 "zcopy": false, 00:17:58.418 "get_zone_info": false, 00:17:58.418 "zone_management": false, 00:17:58.418 "zone_append": false, 00:17:58.418 "compare": false, 00:17:58.418 "compare_and_write": false, 00:17:58.418 "abort": false, 00:17:58.418 "seek_hole": false, 00:17:58.418 "seek_data": false, 00:17:58.418 "copy": false, 00:17:58.418 "nvme_iov_md": false 00:17:58.418 }, 00:17:58.418 
"memory_domains": [ 00:17:58.418 { 00:17:58.418 "dma_device_id": "system", 00:17:58.418 "dma_device_type": 1 00:17:58.418 }, 00:17:58.418 { 00:17:58.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.418 "dma_device_type": 2 00:17:58.418 }, 00:17:58.418 { 00:17:58.418 "dma_device_id": "system", 00:17:58.418 "dma_device_type": 1 00:17:58.418 }, 00:17:58.418 { 00:17:58.418 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.418 "dma_device_type": 2 00:17:58.418 }, 00:17:58.418 { 00:17:58.418 "dma_device_id": "system", 00:17:58.418 "dma_device_type": 1 00:17:58.418 }, 00:17:58.419 { 00:17:58.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.419 "dma_device_type": 2 00:17:58.419 }, 00:17:58.419 { 00:17:58.419 "dma_device_id": "system", 00:17:58.419 "dma_device_type": 1 00:17:58.419 }, 00:17:58.419 { 00:17:58.419 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.419 "dma_device_type": 2 00:17:58.419 } 00:17:58.419 ], 00:17:58.419 "driver_specific": { 00:17:58.419 "raid": { 00:17:58.419 "uuid": "be3c260a-7e72-4a78-8c2b-9e31ac43ba9c", 00:17:58.419 "strip_size_kb": 64, 00:17:58.419 "state": "online", 00:17:58.419 "raid_level": "raid0", 00:17:58.419 "superblock": true, 00:17:58.419 "num_base_bdevs": 4, 00:17:58.419 "num_base_bdevs_discovered": 4, 00:17:58.419 "num_base_bdevs_operational": 4, 00:17:58.419 "base_bdevs_list": [ 00:17:58.419 { 00:17:58.419 "name": "BaseBdev1", 00:17:58.419 "uuid": "5ee52506-672e-4a83-852e-05dde78f1150", 00:17:58.419 "is_configured": true, 00:17:58.419 "data_offset": 2048, 00:17:58.419 "data_size": 63488 00:17:58.419 }, 00:17:58.419 { 00:17:58.419 "name": "BaseBdev2", 00:17:58.419 "uuid": "b1eb19c2-dc8a-4b9a-9b82-bad6564d4f42", 00:17:58.419 "is_configured": true, 00:17:58.419 "data_offset": 2048, 00:17:58.419 "data_size": 63488 00:17:58.419 }, 00:17:58.419 { 00:17:58.419 "name": "BaseBdev3", 00:17:58.419 "uuid": "d138ddbe-565d-4997-96ad-76835ae8eff0", 00:17:58.419 "is_configured": true, 00:17:58.419 "data_offset": 2048, 00:17:58.419 
"data_size": 63488 00:17:58.419 }, 00:17:58.419 { 00:17:58.419 "name": "BaseBdev4", 00:17:58.419 "uuid": "6dddd3d5-4dfc-4341-be1b-da03c4454f9e", 00:17:58.419 "is_configured": true, 00:17:58.419 "data_offset": 2048, 00:17:58.419 "data_size": 63488 00:17:58.419 } 00:17:58.419 ] 00:17:58.419 } 00:17:58.419 } 00:17:58.419 }' 00:17:58.419 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:58.419 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:58.419 BaseBdev2 00:17:58.419 BaseBdev3 00:17:58.419 BaseBdev4' 00:17:58.419 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:58.419 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:58.419 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:58.678 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:58.678 "name": "BaseBdev1", 00:17:58.678 "aliases": [ 00:17:58.678 "5ee52506-672e-4a83-852e-05dde78f1150" 00:17:58.678 ], 00:17:58.678 "product_name": "Malloc disk", 00:17:58.678 "block_size": 512, 00:17:58.678 "num_blocks": 65536, 00:17:58.678 "uuid": "5ee52506-672e-4a83-852e-05dde78f1150", 00:17:58.678 "assigned_rate_limits": { 00:17:58.678 "rw_ios_per_sec": 0, 00:17:58.678 "rw_mbytes_per_sec": 0, 00:17:58.678 "r_mbytes_per_sec": 0, 00:17:58.678 "w_mbytes_per_sec": 0 00:17:58.678 }, 00:17:58.678 "claimed": true, 00:17:58.678 "claim_type": "exclusive_write", 00:17:58.678 "zoned": false, 00:17:58.678 "supported_io_types": { 00:17:58.678 "read": true, 00:17:58.678 "write": true, 00:17:58.678 "unmap": true, 00:17:58.678 "flush": true, 00:17:58.678 "reset": true, 
00:17:58.678 "nvme_admin": false, 00:17:58.678 "nvme_io": false, 00:17:58.678 "nvme_io_md": false, 00:17:58.678 "write_zeroes": true, 00:17:58.678 "zcopy": true, 00:17:58.678 "get_zone_info": false, 00:17:58.678 "zone_management": false, 00:17:58.678 "zone_append": false, 00:17:58.678 "compare": false, 00:17:58.678 "compare_and_write": false, 00:17:58.678 "abort": true, 00:17:58.678 "seek_hole": false, 00:17:58.678 "seek_data": false, 00:17:58.678 "copy": true, 00:17:58.678 "nvme_iov_md": false 00:17:58.678 }, 00:17:58.678 "memory_domains": [ 00:17:58.678 { 00:17:58.678 "dma_device_id": "system", 00:17:58.678 "dma_device_type": 1 00:17:58.678 }, 00:17:58.678 { 00:17:58.678 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:58.678 "dma_device_type": 2 00:17:58.678 } 00:17:58.678 ], 00:17:58.678 "driver_specific": {} 00:17:58.678 }' 00:17:58.678 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.678 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:58.678 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:58.678 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.678 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:58.678 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:58.678 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.939 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:58.939 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:58.939 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:58.939 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:17:58.939 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:58.939 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:58.939 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:58.939 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:59.199 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:59.199 "name": "BaseBdev2", 00:17:59.199 "aliases": [ 00:17:59.199 "b1eb19c2-dc8a-4b9a-9b82-bad6564d4f42" 00:17:59.199 ], 00:17:59.199 "product_name": "Malloc disk", 00:17:59.199 "block_size": 512, 00:17:59.199 "num_blocks": 65536, 00:17:59.199 "uuid": "b1eb19c2-dc8a-4b9a-9b82-bad6564d4f42", 00:17:59.199 "assigned_rate_limits": { 00:17:59.199 "rw_ios_per_sec": 0, 00:17:59.199 "rw_mbytes_per_sec": 0, 00:17:59.199 "r_mbytes_per_sec": 0, 00:17:59.199 "w_mbytes_per_sec": 0 00:17:59.199 }, 00:17:59.199 "claimed": true, 00:17:59.199 "claim_type": "exclusive_write", 00:17:59.199 "zoned": false, 00:17:59.199 "supported_io_types": { 00:17:59.199 "read": true, 00:17:59.199 "write": true, 00:17:59.199 "unmap": true, 00:17:59.199 "flush": true, 00:17:59.199 "reset": true, 00:17:59.199 "nvme_admin": false, 00:17:59.199 "nvme_io": false, 00:17:59.199 "nvme_io_md": false, 00:17:59.199 "write_zeroes": true, 00:17:59.199 "zcopy": true, 00:17:59.199 "get_zone_info": false, 00:17:59.199 "zone_management": false, 00:17:59.199 "zone_append": false, 00:17:59.199 "compare": false, 00:17:59.199 "compare_and_write": false, 00:17:59.199 "abort": true, 00:17:59.199 "seek_hole": false, 00:17:59.199 "seek_data": false, 00:17:59.199 "copy": true, 00:17:59.199 "nvme_iov_md": false 00:17:59.199 }, 00:17:59.199 "memory_domains": [ 00:17:59.199 { 
00:17:59.199 "dma_device_id": "system", 00:17:59.199 "dma_device_type": 1 00:17:59.199 }, 00:17:59.199 { 00:17:59.199 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.199 "dma_device_type": 2 00:17:59.199 } 00:17:59.199 ], 00:17:59.199 "driver_specific": {} 00:17:59.199 }' 00:17:59.199 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.199 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.199 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:59.199 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.199 13:26:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.459 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:59.459 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.459 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.459 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:59.459 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.459 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.459 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:59.459 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:59.459 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:59.459 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:59.718 13:26:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:59.718 "name": "BaseBdev3", 00:17:59.718 "aliases": [ 00:17:59.719 "d138ddbe-565d-4997-96ad-76835ae8eff0" 00:17:59.719 ], 00:17:59.719 "product_name": "Malloc disk", 00:17:59.719 "block_size": 512, 00:17:59.719 "num_blocks": 65536, 00:17:59.719 "uuid": "d138ddbe-565d-4997-96ad-76835ae8eff0", 00:17:59.719 "assigned_rate_limits": { 00:17:59.719 "rw_ios_per_sec": 0, 00:17:59.719 "rw_mbytes_per_sec": 0, 00:17:59.719 "r_mbytes_per_sec": 0, 00:17:59.719 "w_mbytes_per_sec": 0 00:17:59.719 }, 00:17:59.719 "claimed": true, 00:17:59.719 "claim_type": "exclusive_write", 00:17:59.719 "zoned": false, 00:17:59.719 "supported_io_types": { 00:17:59.719 "read": true, 00:17:59.719 "write": true, 00:17:59.719 "unmap": true, 00:17:59.719 "flush": true, 00:17:59.719 "reset": true, 00:17:59.719 "nvme_admin": false, 00:17:59.719 "nvme_io": false, 00:17:59.719 "nvme_io_md": false, 00:17:59.719 "write_zeroes": true, 00:17:59.719 "zcopy": true, 00:17:59.719 "get_zone_info": false, 00:17:59.719 "zone_management": false, 00:17:59.719 "zone_append": false, 00:17:59.719 "compare": false, 00:17:59.719 "compare_and_write": false, 00:17:59.719 "abort": true, 00:17:59.719 "seek_hole": false, 00:17:59.719 "seek_data": false, 00:17:59.719 "copy": true, 00:17:59.719 "nvme_iov_md": false 00:17:59.719 }, 00:17:59.719 "memory_domains": [ 00:17:59.719 { 00:17:59.719 "dma_device_id": "system", 00:17:59.719 "dma_device_type": 1 00:17:59.719 }, 00:17:59.719 { 00:17:59.719 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:59.719 "dma_device_type": 2 00:17:59.719 } 00:17:59.719 ], 00:17:59.719 "driver_specific": {} 00:17:59.719 }' 00:17:59.719 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.719 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:59.719 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:17:59.719 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:59.978 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:00.238 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:00.238 "name": "BaseBdev4", 00:18:00.238 "aliases": [ 00:18:00.238 "6dddd3d5-4dfc-4341-be1b-da03c4454f9e" 00:18:00.238 ], 00:18:00.238 "product_name": "Malloc disk", 00:18:00.238 "block_size": 512, 00:18:00.238 "num_blocks": 65536, 00:18:00.238 "uuid": "6dddd3d5-4dfc-4341-be1b-da03c4454f9e", 00:18:00.238 "assigned_rate_limits": { 00:18:00.238 "rw_ios_per_sec": 0, 00:18:00.238 "rw_mbytes_per_sec": 0, 00:18:00.238 "r_mbytes_per_sec": 0, 00:18:00.238 "w_mbytes_per_sec": 0 
00:18:00.238 }, 00:18:00.238 "claimed": true, 00:18:00.238 "claim_type": "exclusive_write", 00:18:00.238 "zoned": false, 00:18:00.238 "supported_io_types": { 00:18:00.238 "read": true, 00:18:00.238 "write": true, 00:18:00.238 "unmap": true, 00:18:00.238 "flush": true, 00:18:00.238 "reset": true, 00:18:00.238 "nvme_admin": false, 00:18:00.238 "nvme_io": false, 00:18:00.238 "nvme_io_md": false, 00:18:00.238 "write_zeroes": true, 00:18:00.238 "zcopy": true, 00:18:00.238 "get_zone_info": false, 00:18:00.238 "zone_management": false, 00:18:00.238 "zone_append": false, 00:18:00.238 "compare": false, 00:18:00.238 "compare_and_write": false, 00:18:00.238 "abort": true, 00:18:00.238 "seek_hole": false, 00:18:00.238 "seek_data": false, 00:18:00.238 "copy": true, 00:18:00.238 "nvme_iov_md": false 00:18:00.238 }, 00:18:00.238 "memory_domains": [ 00:18:00.238 { 00:18:00.238 "dma_device_id": "system", 00:18:00.238 "dma_device_type": 1 00:18:00.238 }, 00:18:00.238 { 00:18:00.238 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:00.238 "dma_device_type": 2 00:18:00.238 } 00:18:00.238 ], 00:18:00.238 "driver_specific": {} 00:18:00.238 }' 00:18:00.238 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.238 13:26:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:00.498 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:00.498 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.498 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:00.498 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:00.498 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.498 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:00.498 
13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:00.498 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.498 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:00.498 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:00.498 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:00.758 [2024-07-25 13:26:41.443018] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:00.758 [2024-07-25 13:26:41.443040] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:00.758 [2024-07-25 13:26:41.443074] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid0 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:00.758 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:01.017 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:01.017 "name": "Existed_Raid", 00:18:01.017 "uuid": "be3c260a-7e72-4a78-8c2b-9e31ac43ba9c", 00:18:01.017 "strip_size_kb": 64, 00:18:01.017 "state": "offline", 00:18:01.017 "raid_level": "raid0", 00:18:01.017 "superblock": true, 00:18:01.017 "num_base_bdevs": 4, 00:18:01.017 "num_base_bdevs_discovered": 3, 00:18:01.017 "num_base_bdevs_operational": 3, 00:18:01.017 "base_bdevs_list": [ 00:18:01.017 { 00:18:01.017 "name": null, 00:18:01.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:01.017 "is_configured": false, 00:18:01.017 "data_offset": 2048, 00:18:01.017 "data_size": 63488 00:18:01.017 }, 00:18:01.017 { 00:18:01.017 "name": "BaseBdev2", 00:18:01.017 "uuid": "b1eb19c2-dc8a-4b9a-9b82-bad6564d4f42", 00:18:01.017 "is_configured": true, 00:18:01.017 "data_offset": 2048, 00:18:01.017 "data_size": 63488 00:18:01.017 }, 00:18:01.018 
{ 00:18:01.018 "name": "BaseBdev3", 00:18:01.018 "uuid": "d138ddbe-565d-4997-96ad-76835ae8eff0", 00:18:01.018 "is_configured": true, 00:18:01.018 "data_offset": 2048, 00:18:01.018 "data_size": 63488 00:18:01.018 }, 00:18:01.018 { 00:18:01.018 "name": "BaseBdev4", 00:18:01.018 "uuid": "6dddd3d5-4dfc-4341-be1b-da03c4454f9e", 00:18:01.018 "is_configured": true, 00:18:01.018 "data_offset": 2048, 00:18:01.018 "data_size": 63488 00:18:01.018 } 00:18:01.018 ] 00:18:01.018 }' 00:18:01.018 13:26:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:01.018 13:26:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:01.587 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:01.587 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:01.587 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.587 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:01.846 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:01.846 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:01.846 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:01.846 [2024-07-25 13:26:42.593939] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:01.846 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:01.846 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:01.846 
13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.846 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:02.106 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:02.106 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:02.106 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:02.365 [2024-07-25 13:26:42.984708] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:02.365 13:26:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:02.365 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:02.365 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.366 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:02.626 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:02.626 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:02.626 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:02.626 [2024-07-25 13:26:43.371375] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:02.626 [2024-07-25 13:26:43.371406] 
bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11fbfd0 name Existed_Raid, state offline 00:18:02.626 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:02.626 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:02.626 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:02.626 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:03.197 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:03.197 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:03.197 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:03.197 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:03.197 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:03.197 13:26:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:03.457 BaseBdev2 00:18:03.457 13:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:03.457 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:18:03.457 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:03.457 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:18:03.457 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 
00:18:03.457 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:03.457 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:03.717 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:03.717 [ 00:18:03.717 { 00:18:03.717 "name": "BaseBdev2", 00:18:03.717 "aliases": [ 00:18:03.717 "89941acf-3be3-47c8-b6d7-13d89076bc1b" 00:18:03.717 ], 00:18:03.717 "product_name": "Malloc disk", 00:18:03.717 "block_size": 512, 00:18:03.717 "num_blocks": 65536, 00:18:03.717 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:03.717 "assigned_rate_limits": { 00:18:03.717 "rw_ios_per_sec": 0, 00:18:03.717 "rw_mbytes_per_sec": 0, 00:18:03.717 "r_mbytes_per_sec": 0, 00:18:03.717 "w_mbytes_per_sec": 0 00:18:03.717 }, 00:18:03.717 "claimed": false, 00:18:03.717 "zoned": false, 00:18:03.717 "supported_io_types": { 00:18:03.717 "read": true, 00:18:03.717 "write": true, 00:18:03.717 "unmap": true, 00:18:03.717 "flush": true, 00:18:03.717 "reset": true, 00:18:03.717 "nvme_admin": false, 00:18:03.717 "nvme_io": false, 00:18:03.717 "nvme_io_md": false, 00:18:03.717 "write_zeroes": true, 00:18:03.717 "zcopy": true, 00:18:03.717 "get_zone_info": false, 00:18:03.717 "zone_management": false, 00:18:03.717 "zone_append": false, 00:18:03.717 "compare": false, 00:18:03.717 "compare_and_write": false, 00:18:03.717 "abort": true, 00:18:03.717 "seek_hole": false, 00:18:03.717 "seek_data": false, 00:18:03.717 "copy": true, 00:18:03.717 "nvme_iov_md": false 00:18:03.717 }, 00:18:03.717 "memory_domains": [ 00:18:03.717 { 00:18:03.717 "dma_device_id": "system", 00:18:03.717 "dma_device_type": 1 00:18:03.717 }, 00:18:03.717 { 00:18:03.717 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:03.717 "dma_device_type": 2 00:18:03.717 } 00:18:03.717 ], 00:18:03.717 "driver_specific": {} 00:18:03.717 } 00:18:03.717 ] 00:18:03.717 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:03.717 13:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:03.717 13:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:03.717 13:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:03.976 BaseBdev3 00:18:03.977 13:26:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:03.977 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:18:03.977 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:03.977 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:18:03.977 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:03.977 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:03.977 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:04.236 13:26:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:04.496 [ 00:18:04.496 { 00:18:04.496 "name": "BaseBdev3", 00:18:04.496 "aliases": [ 00:18:04.496 "23bd3111-21c1-4df1-830c-0770e75f7938" 
00:18:04.496 ], 00:18:04.496 "product_name": "Malloc disk", 00:18:04.496 "block_size": 512, 00:18:04.496 "num_blocks": 65536, 00:18:04.496 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:04.496 "assigned_rate_limits": { 00:18:04.496 "rw_ios_per_sec": 0, 00:18:04.496 "rw_mbytes_per_sec": 0, 00:18:04.496 "r_mbytes_per_sec": 0, 00:18:04.496 "w_mbytes_per_sec": 0 00:18:04.496 }, 00:18:04.496 "claimed": false, 00:18:04.496 "zoned": false, 00:18:04.496 "supported_io_types": { 00:18:04.496 "read": true, 00:18:04.496 "write": true, 00:18:04.496 "unmap": true, 00:18:04.496 "flush": true, 00:18:04.496 "reset": true, 00:18:04.496 "nvme_admin": false, 00:18:04.496 "nvme_io": false, 00:18:04.496 "nvme_io_md": false, 00:18:04.496 "write_zeroes": true, 00:18:04.496 "zcopy": true, 00:18:04.496 "get_zone_info": false, 00:18:04.496 "zone_management": false, 00:18:04.496 "zone_append": false, 00:18:04.496 "compare": false, 00:18:04.496 "compare_and_write": false, 00:18:04.496 "abort": true, 00:18:04.496 "seek_hole": false, 00:18:04.496 "seek_data": false, 00:18:04.496 "copy": true, 00:18:04.496 "nvme_iov_md": false 00:18:04.496 }, 00:18:04.496 "memory_domains": [ 00:18:04.496 { 00:18:04.496 "dma_device_id": "system", 00:18:04.496 "dma_device_type": 1 00:18:04.496 }, 00:18:04.496 { 00:18:04.496 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.496 "dma_device_type": 2 00:18:04.496 } 00:18:04.496 ], 00:18:04.496 "driver_specific": {} 00:18:04.496 } 00:18:04.496 ] 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4 00:18:04.496 BaseBdev4 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:04.496 13:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:04.756 13:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:05.015 [ 00:18:05.015 { 00:18:05.015 "name": "BaseBdev4", 00:18:05.015 "aliases": [ 00:18:05.015 "27d9c08f-5dc9-4b39-8f7b-37519811fafb" 00:18:05.015 ], 00:18:05.015 "product_name": "Malloc disk", 00:18:05.015 "block_size": 512, 00:18:05.015 "num_blocks": 65536, 00:18:05.015 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:05.015 "assigned_rate_limits": { 00:18:05.015 "rw_ios_per_sec": 0, 00:18:05.015 "rw_mbytes_per_sec": 0, 00:18:05.015 "r_mbytes_per_sec": 0, 00:18:05.015 "w_mbytes_per_sec": 0 00:18:05.016 }, 00:18:05.016 "claimed": false, 00:18:05.016 "zoned": false, 00:18:05.016 "supported_io_types": { 00:18:05.016 "read": true, 00:18:05.016 "write": true, 00:18:05.016 "unmap": true, 00:18:05.016 "flush": true, 00:18:05.016 "reset": true, 00:18:05.016 "nvme_admin": false, 00:18:05.016 "nvme_io": false, 00:18:05.016 
"nvme_io_md": false, 00:18:05.016 "write_zeroes": true, 00:18:05.016 "zcopy": true, 00:18:05.016 "get_zone_info": false, 00:18:05.016 "zone_management": false, 00:18:05.016 "zone_append": false, 00:18:05.016 "compare": false, 00:18:05.016 "compare_and_write": false, 00:18:05.016 "abort": true, 00:18:05.016 "seek_hole": false, 00:18:05.016 "seek_data": false, 00:18:05.016 "copy": true, 00:18:05.016 "nvme_iov_md": false 00:18:05.016 }, 00:18:05.016 "memory_domains": [ 00:18:05.016 { 00:18:05.016 "dma_device_id": "system", 00:18:05.016 "dma_device_type": 1 00:18:05.016 }, 00:18:05.016 { 00:18:05.016 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:05.016 "dma_device_type": 2 00:18:05.016 } 00:18:05.016 ], 00:18:05.016 "driver_specific": {} 00:18:05.016 } 00:18:05.016 ] 00:18:05.016 13:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:05.016 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:05.016 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:05.016 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:05.016 [2024-07-25 13:26:45.803504] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:05.016 [2024-07-25 13:26:45.803537] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:05.016 [2024-07-25 13:26:45.803562] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:05.016 [2024-07-25 13:26:45.804626] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:05.016 [2024-07-25 13:26:45.804660] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 
is claimed 00:18:05.275 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:05.276 "name": "Existed_Raid", 00:18:05.276 "uuid": "f91a3639-1cd1-41bf-b027-1e09440f07f1", 00:18:05.276 "strip_size_kb": 64, 00:18:05.276 "state": "configuring", 00:18:05.276 "raid_level": "raid0", 00:18:05.276 "superblock": true, 00:18:05.276 "num_base_bdevs": 4, 00:18:05.276 "num_base_bdevs_discovered": 3, 00:18:05.276 
"num_base_bdevs_operational": 4, 00:18:05.276 "base_bdevs_list": [ 00:18:05.276 { 00:18:05.276 "name": "BaseBdev1", 00:18:05.276 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:05.276 "is_configured": false, 00:18:05.276 "data_offset": 0, 00:18:05.276 "data_size": 0 00:18:05.276 }, 00:18:05.276 { 00:18:05.276 "name": "BaseBdev2", 00:18:05.276 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:05.276 "is_configured": true, 00:18:05.276 "data_offset": 2048, 00:18:05.276 "data_size": 63488 00:18:05.276 }, 00:18:05.276 { 00:18:05.276 "name": "BaseBdev3", 00:18:05.276 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:05.276 "is_configured": true, 00:18:05.276 "data_offset": 2048, 00:18:05.276 "data_size": 63488 00:18:05.276 }, 00:18:05.276 { 00:18:05.276 "name": "BaseBdev4", 00:18:05.276 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:05.276 "is_configured": true, 00:18:05.276 "data_offset": 2048, 00:18:05.276 "data_size": 63488 00:18:05.276 } 00:18:05.276 ] 00:18:05.276 }' 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:05.276 13:26:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:05.881 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:06.175 [2024-07-25 13:26:46.898234] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.175 13:26:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:06.435 13:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:06.435 "name": "Existed_Raid", 00:18:06.435 "uuid": "f91a3639-1cd1-41bf-b027-1e09440f07f1", 00:18:06.435 "strip_size_kb": 64, 00:18:06.435 "state": "configuring", 00:18:06.435 "raid_level": "raid0", 00:18:06.435 "superblock": true, 00:18:06.435 "num_base_bdevs": 4, 00:18:06.435 "num_base_bdevs_discovered": 2, 00:18:06.435 "num_base_bdevs_operational": 4, 00:18:06.435 "base_bdevs_list": [ 00:18:06.435 { 00:18:06.435 "name": "BaseBdev1", 00:18:06.435 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.435 "is_configured": false, 00:18:06.435 "data_offset": 0, 00:18:06.435 "data_size": 0 00:18:06.435 }, 00:18:06.435 { 00:18:06.435 "name": null, 00:18:06.435 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:06.435 "is_configured": false, 00:18:06.435 "data_offset": 2048, 00:18:06.435 "data_size": 
63488 00:18:06.435 }, 00:18:06.435 { 00:18:06.435 "name": "BaseBdev3", 00:18:06.435 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:06.435 "is_configured": true, 00:18:06.435 "data_offset": 2048, 00:18:06.435 "data_size": 63488 00:18:06.435 }, 00:18:06.435 { 00:18:06.435 "name": "BaseBdev4", 00:18:06.435 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:06.435 "is_configured": true, 00:18:06.435 "data_offset": 2048, 00:18:06.435 "data_size": 63488 00:18:06.435 } 00:18:06.435 ] 00:18:06.435 }' 00:18:06.435 13:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:06.435 13:26:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:07.374 13:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.374 13:26:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:07.635 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:07.635 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:07.895 [2024-07-25 13:26:48.527389] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:07.895 BaseBdev1 00:18:07.895 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:07.895 13:26:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:18:07.895 13:26:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:07.895 13:26:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 
00:18:07.895 13:26:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:07.895 13:26:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:07.895 13:26:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:08.155 13:26:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:08.155 [ 00:18:08.155 { 00:18:08.155 "name": "BaseBdev1", 00:18:08.155 "aliases": [ 00:18:08.155 "9a292432-b6dc-490c-ae1d-255750554ac2" 00:18:08.155 ], 00:18:08.155 "product_name": "Malloc disk", 00:18:08.155 "block_size": 512, 00:18:08.155 "num_blocks": 65536, 00:18:08.155 "uuid": "9a292432-b6dc-490c-ae1d-255750554ac2", 00:18:08.155 "assigned_rate_limits": { 00:18:08.155 "rw_ios_per_sec": 0, 00:18:08.155 "rw_mbytes_per_sec": 0, 00:18:08.155 "r_mbytes_per_sec": 0, 00:18:08.155 "w_mbytes_per_sec": 0 00:18:08.155 }, 00:18:08.155 "claimed": true, 00:18:08.155 "claim_type": "exclusive_write", 00:18:08.155 "zoned": false, 00:18:08.155 "supported_io_types": { 00:18:08.155 "read": true, 00:18:08.155 "write": true, 00:18:08.155 "unmap": true, 00:18:08.155 "flush": true, 00:18:08.155 "reset": true, 00:18:08.155 "nvme_admin": false, 00:18:08.155 "nvme_io": false, 00:18:08.155 "nvme_io_md": false, 00:18:08.155 "write_zeroes": true, 00:18:08.155 "zcopy": true, 00:18:08.155 "get_zone_info": false, 00:18:08.155 "zone_management": false, 00:18:08.155 "zone_append": false, 00:18:08.155 "compare": false, 00:18:08.155 "compare_and_write": false, 00:18:08.155 "abort": true, 00:18:08.155 "seek_hole": false, 00:18:08.155 "seek_data": false, 00:18:08.155 "copy": true, 00:18:08.155 "nvme_iov_md": false 00:18:08.155 }, 00:18:08.156 
"memory_domains": [ 00:18:08.156 { 00:18:08.156 "dma_device_id": "system", 00:18:08.156 "dma_device_type": 1 00:18:08.156 }, 00:18:08.156 { 00:18:08.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:08.156 "dma_device_type": 2 00:18:08.156 } 00:18:08.156 ], 00:18:08.156 "driver_specific": {} 00:18:08.156 } 00:18:08.156 ] 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:08.156 13:26:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.415 13:26:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.415 "name": "Existed_Raid", 00:18:08.415 "uuid": "f91a3639-1cd1-41bf-b027-1e09440f07f1", 00:18:08.415 "strip_size_kb": 64, 00:18:08.415 "state": "configuring", 00:18:08.415 "raid_level": "raid0", 00:18:08.415 "superblock": true, 00:18:08.415 "num_base_bdevs": 4, 00:18:08.415 "num_base_bdevs_discovered": 3, 00:18:08.415 "num_base_bdevs_operational": 4, 00:18:08.415 "base_bdevs_list": [ 00:18:08.415 { 00:18:08.415 "name": "BaseBdev1", 00:18:08.415 "uuid": "9a292432-b6dc-490c-ae1d-255750554ac2", 00:18:08.415 "is_configured": true, 00:18:08.415 "data_offset": 2048, 00:18:08.415 "data_size": 63488 00:18:08.415 }, 00:18:08.415 { 00:18:08.415 "name": null, 00:18:08.415 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:08.415 "is_configured": false, 00:18:08.415 "data_offset": 2048, 00:18:08.415 "data_size": 63488 00:18:08.415 }, 00:18:08.415 { 00:18:08.415 "name": "BaseBdev3", 00:18:08.415 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:08.415 "is_configured": true, 00:18:08.415 "data_offset": 2048, 00:18:08.415 "data_size": 63488 00:18:08.415 }, 00:18:08.415 { 00:18:08.415 "name": "BaseBdev4", 00:18:08.415 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:08.415 "is_configured": true, 00:18:08.415 "data_offset": 2048, 00:18:08.415 "data_size": 63488 00:18:08.415 } 00:18:08.415 ] 00:18:08.415 }' 00:18:08.415 13:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.415 13:26:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:09.354 13:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.354 13:26:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:09.354 13:26:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:09.354 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:09.923 [2024-07-25 13:26:50.624816] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:09.923 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:10.183 13:26:50 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:10.183 "name": "Existed_Raid", 00:18:10.183 "uuid": "f91a3639-1cd1-41bf-b027-1e09440f07f1", 00:18:10.183 "strip_size_kb": 64, 00:18:10.183 "state": "configuring", 00:18:10.183 "raid_level": "raid0", 00:18:10.183 "superblock": true, 00:18:10.183 "num_base_bdevs": 4, 00:18:10.183 "num_base_bdevs_discovered": 2, 00:18:10.183 "num_base_bdevs_operational": 4, 00:18:10.183 "base_bdevs_list": [ 00:18:10.183 { 00:18:10.183 "name": "BaseBdev1", 00:18:10.183 "uuid": "9a292432-b6dc-490c-ae1d-255750554ac2", 00:18:10.183 "is_configured": true, 00:18:10.183 "data_offset": 2048, 00:18:10.183 "data_size": 63488 00:18:10.183 }, 00:18:10.183 { 00:18:10.183 "name": null, 00:18:10.183 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:10.183 "is_configured": false, 00:18:10.183 "data_offset": 2048, 00:18:10.183 "data_size": 63488 00:18:10.183 }, 00:18:10.183 { 00:18:10.183 "name": null, 00:18:10.183 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:10.183 "is_configured": false, 00:18:10.183 "data_offset": 2048, 00:18:10.183 "data_size": 63488 00:18:10.183 }, 00:18:10.183 { 00:18:10.183 "name": "BaseBdev4", 00:18:10.183 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:10.183 "is_configured": true, 00:18:10.183 "data_offset": 2048, 00:18:10.183 "data_size": 63488 00:18:10.183 } 00:18:10.183 ] 00:18:10.183 }' 00:18:10.183 13:26:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:10.183 13:26:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:11.122 13:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.122 13:26:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:11.692 13:26:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:11.692 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:11.952 [2024-07-25 13:26:52.589789] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.952 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:12.212 13:26:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:12.212 "name": "Existed_Raid", 00:18:12.212 "uuid": "f91a3639-1cd1-41bf-b027-1e09440f07f1", 00:18:12.212 "strip_size_kb": 64, 00:18:12.212 "state": "configuring", 00:18:12.212 "raid_level": "raid0", 00:18:12.212 "superblock": true, 00:18:12.212 "num_base_bdevs": 4, 00:18:12.212 "num_base_bdevs_discovered": 3, 00:18:12.212 "num_base_bdevs_operational": 4, 00:18:12.212 "base_bdevs_list": [ 00:18:12.212 { 00:18:12.212 "name": "BaseBdev1", 00:18:12.212 "uuid": "9a292432-b6dc-490c-ae1d-255750554ac2", 00:18:12.212 "is_configured": true, 00:18:12.212 "data_offset": 2048, 00:18:12.212 "data_size": 63488 00:18:12.212 }, 00:18:12.212 { 00:18:12.212 "name": null, 00:18:12.212 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:12.212 "is_configured": false, 00:18:12.212 "data_offset": 2048, 00:18:12.212 "data_size": 63488 00:18:12.212 }, 00:18:12.212 { 00:18:12.212 "name": "BaseBdev3", 00:18:12.212 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:12.212 "is_configured": true, 00:18:12.212 "data_offset": 2048, 00:18:12.212 "data_size": 63488 00:18:12.212 }, 00:18:12.212 { 00:18:12.212 "name": "BaseBdev4", 00:18:12.212 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:12.212 "is_configured": true, 00:18:12.212 "data_offset": 2048, 00:18:12.212 "data_size": 63488 00:18:12.212 } 00:18:12.212 ] 00:18:12.212 }' 00:18:12.212 13:26:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:12.212 13:26:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:13.149 13:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.149 13:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:13.409 13:26:53 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:13.409 13:26:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:13.409 [2024-07-25 13:26:54.137721] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:13.409 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:13.409 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:13.410 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:13.410 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:13.410 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:13.410 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:13.410 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:13.410 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:13.410 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:13.410 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:13.410 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.410 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:13.669 13:26:54 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:13.669 "name": "Existed_Raid", 00:18:13.669 "uuid": "f91a3639-1cd1-41bf-b027-1e09440f07f1", 00:18:13.669 "strip_size_kb": 64, 00:18:13.669 "state": "configuring", 00:18:13.669 "raid_level": "raid0", 00:18:13.669 "superblock": true, 00:18:13.669 "num_base_bdevs": 4, 00:18:13.669 "num_base_bdevs_discovered": 2, 00:18:13.669 "num_base_bdevs_operational": 4, 00:18:13.669 "base_bdevs_list": [ 00:18:13.669 { 00:18:13.669 "name": null, 00:18:13.669 "uuid": "9a292432-b6dc-490c-ae1d-255750554ac2", 00:18:13.669 "is_configured": false, 00:18:13.669 "data_offset": 2048, 00:18:13.669 "data_size": 63488 00:18:13.669 }, 00:18:13.669 { 00:18:13.669 "name": null, 00:18:13.669 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:13.669 "is_configured": false, 00:18:13.669 "data_offset": 2048, 00:18:13.669 "data_size": 63488 00:18:13.669 }, 00:18:13.669 { 00:18:13.669 "name": "BaseBdev3", 00:18:13.669 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:13.669 "is_configured": true, 00:18:13.669 "data_offset": 2048, 00:18:13.669 "data_size": 63488 00:18:13.669 }, 00:18:13.669 { 00:18:13.669 "name": "BaseBdev4", 00:18:13.669 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:13.669 "is_configured": true, 00:18:13.669 "data_offset": 2048, 00:18:13.669 "data_size": 63488 00:18:13.669 } 00:18:13.669 ] 00:18:13.669 }' 00:18:13.669 13:26:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:13.669 13:26:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:14.608 13:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.608 13:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:14.868 13:26:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:14.868 13:26:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:15.437 [2024-07-25 13:26:56.036564] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:15.437 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:16.007 13:26:56 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:16.007 "name": "Existed_Raid", 00:18:16.007 "uuid": "f91a3639-1cd1-41bf-b027-1e09440f07f1", 00:18:16.007 "strip_size_kb": 64, 00:18:16.007 "state": "configuring", 00:18:16.007 "raid_level": "raid0", 00:18:16.007 "superblock": true, 00:18:16.007 "num_base_bdevs": 4, 00:18:16.007 "num_base_bdevs_discovered": 3, 00:18:16.007 "num_base_bdevs_operational": 4, 00:18:16.007 "base_bdevs_list": [ 00:18:16.007 { 00:18:16.007 "name": null, 00:18:16.007 "uuid": "9a292432-b6dc-490c-ae1d-255750554ac2", 00:18:16.007 "is_configured": false, 00:18:16.007 "data_offset": 2048, 00:18:16.007 "data_size": 63488 00:18:16.007 }, 00:18:16.007 { 00:18:16.007 "name": "BaseBdev2", 00:18:16.007 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:16.007 "is_configured": true, 00:18:16.007 "data_offset": 2048, 00:18:16.007 "data_size": 63488 00:18:16.007 }, 00:18:16.007 { 00:18:16.007 "name": "BaseBdev3", 00:18:16.007 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:16.007 "is_configured": true, 00:18:16.007 "data_offset": 2048, 00:18:16.007 "data_size": 63488 00:18:16.007 }, 00:18:16.007 { 00:18:16.007 "name": "BaseBdev4", 00:18:16.007 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:16.007 "is_configured": true, 00:18:16.007 "data_offset": 2048, 00:18:16.007 "data_size": 63488 00:18:16.007 } 00:18:16.007 ] 00:18:16.007 }' 00:18:16.007 13:26:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:16.007 13:26:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:16.577 13:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.577 13:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:16.577 13:26:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:16.837 13:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.837 13:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:17.407 13:26:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 9a292432-b6dc-490c-ae1d-255750554ac2 00:18:17.666 [2024-07-25 13:26:58.431615] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:17.666 [2024-07-25 13:26:58.431741] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x11fbc70 00:18:17.666 [2024-07-25 13:26:58.431749] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:17.666 [2024-07-25 13:26:58.431890] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13a10c0 00:18:17.666 [2024-07-25 13:26:58.431979] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11fbc70 00:18:17.666 [2024-07-25 13:26:58.431984] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x11fbc70 00:18:17.666 [2024-07-25 13:26:58.432053] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:17.666 NewBaseBdev 00:18:17.926 13:26:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:17.926 13:26:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:18:17.926 13:26:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:18:17.926 13:26:58 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@901 -- # local i 00:18:17.926 13:26:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:18:17.926 13:26:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:18:17.926 13:26:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:18.495 13:26:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:18.754 [ 00:18:18.754 { 00:18:18.754 "name": "NewBaseBdev", 00:18:18.754 "aliases": [ 00:18:18.754 "9a292432-b6dc-490c-ae1d-255750554ac2" 00:18:18.754 ], 00:18:18.754 "product_name": "Malloc disk", 00:18:18.754 "block_size": 512, 00:18:18.754 "num_blocks": 65536, 00:18:18.754 "uuid": "9a292432-b6dc-490c-ae1d-255750554ac2", 00:18:18.754 "assigned_rate_limits": { 00:18:18.754 "rw_ios_per_sec": 0, 00:18:18.754 "rw_mbytes_per_sec": 0, 00:18:18.754 "r_mbytes_per_sec": 0, 00:18:18.754 "w_mbytes_per_sec": 0 00:18:18.754 }, 00:18:18.754 "claimed": true, 00:18:18.754 "claim_type": "exclusive_write", 00:18:18.754 "zoned": false, 00:18:18.754 "supported_io_types": { 00:18:18.754 "read": true, 00:18:18.754 "write": true, 00:18:18.754 "unmap": true, 00:18:18.754 "flush": true, 00:18:18.754 "reset": true, 00:18:18.754 "nvme_admin": false, 00:18:18.754 "nvme_io": false, 00:18:18.754 "nvme_io_md": false, 00:18:18.754 "write_zeroes": true, 00:18:18.755 "zcopy": true, 00:18:18.755 "get_zone_info": false, 00:18:18.755 "zone_management": false, 00:18:18.755 "zone_append": false, 00:18:18.755 "compare": false, 00:18:18.755 "compare_and_write": false, 00:18:18.755 "abort": true, 00:18:18.755 "seek_hole": false, 00:18:18.755 "seek_data": false, 00:18:18.755 "copy": true, 00:18:18.755 
"nvme_iov_md": false 00:18:18.755 }, 00:18:18.755 "memory_domains": [ 00:18:18.755 { 00:18:18.755 "dma_device_id": "system", 00:18:18.755 "dma_device_type": 1 00:18:18.755 }, 00:18:18.755 { 00:18:18.755 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:18.755 "dma_device_type": 2 00:18:18.755 } 00:18:18.755 ], 00:18:18.755 "driver_specific": {} 00:18:18.755 } 00:18:18.755 ] 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:18.755 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:19.014 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.014 13:26:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:18:19.584 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:19.584 "name": "Existed_Raid", 00:18:19.584 "uuid": "f91a3639-1cd1-41bf-b027-1e09440f07f1", 00:18:19.584 "strip_size_kb": 64, 00:18:19.584 "state": "online", 00:18:19.584 "raid_level": "raid0", 00:18:19.584 "superblock": true, 00:18:19.584 "num_base_bdevs": 4, 00:18:19.584 "num_base_bdevs_discovered": 4, 00:18:19.584 "num_base_bdevs_operational": 4, 00:18:19.584 "base_bdevs_list": [ 00:18:19.584 { 00:18:19.584 "name": "NewBaseBdev", 00:18:19.584 "uuid": "9a292432-b6dc-490c-ae1d-255750554ac2", 00:18:19.584 "is_configured": true, 00:18:19.584 "data_offset": 2048, 00:18:19.584 "data_size": 63488 00:18:19.584 }, 00:18:19.584 { 00:18:19.584 "name": "BaseBdev2", 00:18:19.584 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:19.584 "is_configured": true, 00:18:19.584 "data_offset": 2048, 00:18:19.584 "data_size": 63488 00:18:19.584 }, 00:18:19.584 { 00:18:19.584 "name": "BaseBdev3", 00:18:19.584 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:19.584 "is_configured": true, 00:18:19.584 "data_offset": 2048, 00:18:19.584 "data_size": 63488 00:18:19.584 }, 00:18:19.584 { 00:18:19.584 "name": "BaseBdev4", 00:18:19.584 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:19.584 "is_configured": true, 00:18:19.584 "data_offset": 2048, 00:18:19.584 "data_size": 63488 00:18:19.584 } 00:18:19.584 ] 00:18:19.584 }' 00:18:19.584 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:19.584 13:27:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # 
local raid_bdev_info 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:20.154 [2024-07-25 13:27:00.829964] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:20.154 "name": "Existed_Raid", 00:18:20.154 "aliases": [ 00:18:20.154 "f91a3639-1cd1-41bf-b027-1e09440f07f1" 00:18:20.154 ], 00:18:20.154 "product_name": "Raid Volume", 00:18:20.154 "block_size": 512, 00:18:20.154 "num_blocks": 253952, 00:18:20.154 "uuid": "f91a3639-1cd1-41bf-b027-1e09440f07f1", 00:18:20.154 "assigned_rate_limits": { 00:18:20.154 "rw_ios_per_sec": 0, 00:18:20.154 "rw_mbytes_per_sec": 0, 00:18:20.154 "r_mbytes_per_sec": 0, 00:18:20.154 "w_mbytes_per_sec": 0 00:18:20.154 }, 00:18:20.154 "claimed": false, 00:18:20.154 "zoned": false, 00:18:20.154 "supported_io_types": { 00:18:20.154 "read": true, 00:18:20.154 "write": true, 00:18:20.154 "unmap": true, 00:18:20.154 "flush": true, 00:18:20.154 "reset": true, 00:18:20.154 "nvme_admin": false, 00:18:20.154 "nvme_io": false, 00:18:20.154 "nvme_io_md": false, 00:18:20.154 "write_zeroes": true, 00:18:20.154 "zcopy": false, 00:18:20.154 "get_zone_info": false, 00:18:20.154 "zone_management": false, 00:18:20.154 "zone_append": false, 00:18:20.154 "compare": false, 00:18:20.154 "compare_and_write": false, 00:18:20.154 "abort": false, 
00:18:20.154 "seek_hole": false, 00:18:20.154 "seek_data": false, 00:18:20.154 "copy": false, 00:18:20.154 "nvme_iov_md": false 00:18:20.154 }, 00:18:20.154 "memory_domains": [ 00:18:20.154 { 00:18:20.154 "dma_device_id": "system", 00:18:20.154 "dma_device_type": 1 00:18:20.154 }, 00:18:20.154 { 00:18:20.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.154 "dma_device_type": 2 00:18:20.154 }, 00:18:20.154 { 00:18:20.154 "dma_device_id": "system", 00:18:20.154 "dma_device_type": 1 00:18:20.154 }, 00:18:20.154 { 00:18:20.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.154 "dma_device_type": 2 00:18:20.154 }, 00:18:20.154 { 00:18:20.154 "dma_device_id": "system", 00:18:20.154 "dma_device_type": 1 00:18:20.154 }, 00:18:20.154 { 00:18:20.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.154 "dma_device_type": 2 00:18:20.154 }, 00:18:20.154 { 00:18:20.154 "dma_device_id": "system", 00:18:20.154 "dma_device_type": 1 00:18:20.154 }, 00:18:20.154 { 00:18:20.154 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.154 "dma_device_type": 2 00:18:20.154 } 00:18:20.154 ], 00:18:20.154 "driver_specific": { 00:18:20.154 "raid": { 00:18:20.154 "uuid": "f91a3639-1cd1-41bf-b027-1e09440f07f1", 00:18:20.154 "strip_size_kb": 64, 00:18:20.154 "state": "online", 00:18:20.154 "raid_level": "raid0", 00:18:20.154 "superblock": true, 00:18:20.154 "num_base_bdevs": 4, 00:18:20.154 "num_base_bdevs_discovered": 4, 00:18:20.154 "num_base_bdevs_operational": 4, 00:18:20.154 "base_bdevs_list": [ 00:18:20.154 { 00:18:20.154 "name": "NewBaseBdev", 00:18:20.154 "uuid": "9a292432-b6dc-490c-ae1d-255750554ac2", 00:18:20.154 "is_configured": true, 00:18:20.154 "data_offset": 2048, 00:18:20.154 "data_size": 63488 00:18:20.154 }, 00:18:20.154 { 00:18:20.154 "name": "BaseBdev2", 00:18:20.154 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:20.154 "is_configured": true, 00:18:20.154 "data_offset": 2048, 00:18:20.154 "data_size": 63488 00:18:20.154 }, 00:18:20.154 { 00:18:20.154 "name": 
"BaseBdev3", 00:18:20.154 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:20.154 "is_configured": true, 00:18:20.154 "data_offset": 2048, 00:18:20.154 "data_size": 63488 00:18:20.154 }, 00:18:20.154 { 00:18:20.154 "name": "BaseBdev4", 00:18:20.154 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:20.154 "is_configured": true, 00:18:20.154 "data_offset": 2048, 00:18:20.154 "data_size": 63488 00:18:20.154 } 00:18:20.154 ] 00:18:20.154 } 00:18:20.154 } 00:18:20.154 }' 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:20.154 BaseBdev2 00:18:20.154 BaseBdev3 00:18:20.154 BaseBdev4' 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:20.154 13:27:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:20.413 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:20.413 "name": "NewBaseBdev", 00:18:20.413 "aliases": [ 00:18:20.413 "9a292432-b6dc-490c-ae1d-255750554ac2" 00:18:20.413 ], 00:18:20.413 "product_name": "Malloc disk", 00:18:20.413 "block_size": 512, 00:18:20.413 "num_blocks": 65536, 00:18:20.413 "uuid": "9a292432-b6dc-490c-ae1d-255750554ac2", 00:18:20.413 "assigned_rate_limits": { 00:18:20.413 "rw_ios_per_sec": 0, 00:18:20.413 "rw_mbytes_per_sec": 0, 00:18:20.413 "r_mbytes_per_sec": 0, 00:18:20.413 "w_mbytes_per_sec": 0 00:18:20.413 }, 00:18:20.413 "claimed": true, 00:18:20.413 "claim_type": "exclusive_write", 00:18:20.413 "zoned": false, 00:18:20.413 
"supported_io_types": { 00:18:20.413 "read": true, 00:18:20.413 "write": true, 00:18:20.413 "unmap": true, 00:18:20.413 "flush": true, 00:18:20.413 "reset": true, 00:18:20.413 "nvme_admin": false, 00:18:20.413 "nvme_io": false, 00:18:20.413 "nvme_io_md": false, 00:18:20.413 "write_zeroes": true, 00:18:20.413 "zcopy": true, 00:18:20.413 "get_zone_info": false, 00:18:20.413 "zone_management": false, 00:18:20.413 "zone_append": false, 00:18:20.413 "compare": false, 00:18:20.413 "compare_and_write": false, 00:18:20.414 "abort": true, 00:18:20.414 "seek_hole": false, 00:18:20.414 "seek_data": false, 00:18:20.414 "copy": true, 00:18:20.414 "nvme_iov_md": false 00:18:20.414 }, 00:18:20.414 "memory_domains": [ 00:18:20.414 { 00:18:20.414 "dma_device_id": "system", 00:18:20.414 "dma_device_type": 1 00:18:20.414 }, 00:18:20.414 { 00:18:20.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.414 "dma_device_type": 2 00:18:20.414 } 00:18:20.414 ], 00:18:20.414 "driver_specific": {} 00:18:20.414 }' 00:18:20.414 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:20.414 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:20.673 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:20.673 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.673 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:20.673 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:20.673 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.673 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:20.673 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:20.673 13:27:01 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.673 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:20.933 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:20.933 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:20.933 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:20.933 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:20.933 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:20.933 "name": "BaseBdev2", 00:18:20.933 "aliases": [ 00:18:20.933 "89941acf-3be3-47c8-b6d7-13d89076bc1b" 00:18:20.933 ], 00:18:20.933 "product_name": "Malloc disk", 00:18:20.933 "block_size": 512, 00:18:20.933 "num_blocks": 65536, 00:18:20.933 "uuid": "89941acf-3be3-47c8-b6d7-13d89076bc1b", 00:18:20.933 "assigned_rate_limits": { 00:18:20.933 "rw_ios_per_sec": 0, 00:18:20.933 "rw_mbytes_per_sec": 0, 00:18:20.933 "r_mbytes_per_sec": 0, 00:18:20.933 "w_mbytes_per_sec": 0 00:18:20.933 }, 00:18:20.933 "claimed": true, 00:18:20.933 "claim_type": "exclusive_write", 00:18:20.933 "zoned": false, 00:18:20.933 "supported_io_types": { 00:18:20.933 "read": true, 00:18:20.933 "write": true, 00:18:20.933 "unmap": true, 00:18:20.933 "flush": true, 00:18:20.933 "reset": true, 00:18:20.933 "nvme_admin": false, 00:18:20.934 "nvme_io": false, 00:18:20.934 "nvme_io_md": false, 00:18:20.934 "write_zeroes": true, 00:18:20.934 "zcopy": true, 00:18:20.934 "get_zone_info": false, 00:18:20.934 "zone_management": false, 00:18:20.934 "zone_append": false, 00:18:20.934 "compare": false, 00:18:20.934 "compare_and_write": false, 00:18:20.934 "abort": true, 00:18:20.934 
"seek_hole": false, 00:18:20.934 "seek_data": false, 00:18:20.934 "copy": true, 00:18:20.934 "nvme_iov_md": false 00:18:20.934 }, 00:18:20.934 "memory_domains": [ 00:18:20.934 { 00:18:20.934 "dma_device_id": "system", 00:18:20.934 "dma_device_type": 1 00:18:20.934 }, 00:18:20.934 { 00:18:20.934 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:20.934 "dma_device_type": 2 00:18:20.934 } 00:18:20.934 ], 00:18:20.934 "driver_specific": {} 00:18:20.934 }' 00:18:20.934 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.193 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.193 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:21.193 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:21.193 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:21.193 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:21.193 13:27:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.478 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:21.478 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:21.478 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.478 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:21.478 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:21.478 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:21.478 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:21.478 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:21.739 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:21.739 "name": "BaseBdev3", 00:18:21.739 "aliases": [ 00:18:21.739 "23bd3111-21c1-4df1-830c-0770e75f7938" 00:18:21.739 ], 00:18:21.739 "product_name": "Malloc disk", 00:18:21.739 "block_size": 512, 00:18:21.739 "num_blocks": 65536, 00:18:21.739 "uuid": "23bd3111-21c1-4df1-830c-0770e75f7938", 00:18:21.739 "assigned_rate_limits": { 00:18:21.739 "rw_ios_per_sec": 0, 00:18:21.739 "rw_mbytes_per_sec": 0, 00:18:21.739 "r_mbytes_per_sec": 0, 00:18:21.739 "w_mbytes_per_sec": 0 00:18:21.739 }, 00:18:21.739 "claimed": true, 00:18:21.739 "claim_type": "exclusive_write", 00:18:21.739 "zoned": false, 00:18:21.739 "supported_io_types": { 00:18:21.739 "read": true, 00:18:21.739 "write": true, 00:18:21.739 "unmap": true, 00:18:21.739 "flush": true, 00:18:21.739 "reset": true, 00:18:21.739 "nvme_admin": false, 00:18:21.739 "nvme_io": false, 00:18:21.739 "nvme_io_md": false, 00:18:21.739 "write_zeroes": true, 00:18:21.739 "zcopy": true, 00:18:21.739 "get_zone_info": false, 00:18:21.739 "zone_management": false, 00:18:21.739 "zone_append": false, 00:18:21.739 "compare": false, 00:18:21.739 "compare_and_write": false, 00:18:21.739 "abort": true, 00:18:21.739 "seek_hole": false, 00:18:21.739 "seek_data": false, 00:18:21.739 "copy": true, 00:18:21.739 "nvme_iov_md": false 00:18:21.739 }, 00:18:21.739 "memory_domains": [ 00:18:21.739 { 00:18:21.739 "dma_device_id": "system", 00:18:21.739 "dma_device_type": 1 00:18:21.739 }, 00:18:21.739 { 00:18:21.739 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:21.739 "dma_device_type": 2 00:18:21.739 } 00:18:21.739 ], 00:18:21.739 "driver_specific": {} 00:18:21.739 }' 00:18:21.739 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.739 
13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:21.739 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:21.739 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:22.000 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:22.261 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:22.261 "name": "BaseBdev4", 00:18:22.261 "aliases": [ 00:18:22.261 "27d9c08f-5dc9-4b39-8f7b-37519811fafb" 00:18:22.261 ], 00:18:22.261 "product_name": "Malloc disk", 00:18:22.261 "block_size": 512, 00:18:22.261 "num_blocks": 65536, 00:18:22.261 "uuid": "27d9c08f-5dc9-4b39-8f7b-37519811fafb", 00:18:22.261 
"assigned_rate_limits": { 00:18:22.261 "rw_ios_per_sec": 0, 00:18:22.261 "rw_mbytes_per_sec": 0, 00:18:22.261 "r_mbytes_per_sec": 0, 00:18:22.261 "w_mbytes_per_sec": 0 00:18:22.261 }, 00:18:22.261 "claimed": true, 00:18:22.261 "claim_type": "exclusive_write", 00:18:22.261 "zoned": false, 00:18:22.261 "supported_io_types": { 00:18:22.261 "read": true, 00:18:22.261 "write": true, 00:18:22.261 "unmap": true, 00:18:22.261 "flush": true, 00:18:22.261 "reset": true, 00:18:22.261 "nvme_admin": false, 00:18:22.261 "nvme_io": false, 00:18:22.261 "nvme_io_md": false, 00:18:22.261 "write_zeroes": true, 00:18:22.261 "zcopy": true, 00:18:22.261 "get_zone_info": false, 00:18:22.261 "zone_management": false, 00:18:22.261 "zone_append": false, 00:18:22.261 "compare": false, 00:18:22.261 "compare_and_write": false, 00:18:22.261 "abort": true, 00:18:22.261 "seek_hole": false, 00:18:22.261 "seek_data": false, 00:18:22.261 "copy": true, 00:18:22.261 "nvme_iov_md": false 00:18:22.261 }, 00:18:22.261 "memory_domains": [ 00:18:22.261 { 00:18:22.261 "dma_device_id": "system", 00:18:22.261 "dma_device_type": 1 00:18:22.261 }, 00:18:22.261 { 00:18:22.261 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:22.261 "dma_device_type": 2 00:18:22.261 } 00:18:22.261 ], 00:18:22.261 "driver_specific": {} 00:18:22.261 }' 00:18:22.261 13:27:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.261 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:22.521 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:22.521 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.521 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:22.521 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:22.521 13:27:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.521 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:22.779 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:22.779 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.779 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:22.779 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:22.780 13:27:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:23.349 [2024-07-25 13:27:04.021792] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:23.349 [2024-07-25 13:27:04.021814] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:23.349 [2024-07-25 13:27:04.021854] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:23.349 [2024-07-25 13:27:04.021899] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:23.349 [2024-07-25 13:27:04.021906] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11fbc70 name Existed_Raid, state offline 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 946342 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 946342 ']' 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 946342 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # 
'[' Linux = Linux ']' 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 946342 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 946342' 00:18:23.349 killing process with pid 946342 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 946342 00:18:23.349 [2024-07-25 13:27:04.107098] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:23.349 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 946342 00:18:23.349 [2024-07-25 13:27:04.127799] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:23.609 13:27:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:18:23.609 00:18:23.609 real 0m33.971s 00:18:23.609 user 1m4.187s 00:18:23.609 sys 0m4.630s 00:18:23.609 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:23.609 13:27:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:23.609 ************************************ 00:18:23.609 END TEST raid_state_function_test_sb 00:18:23.609 ************************************ 00:18:23.609 13:27:04 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:18:23.609 13:27:04 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:18:23.609 13:27:04 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:23.609 13:27:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:23.609 ************************************ 
00:18:23.609 START TEST raid_superblock_test 00:18:23.609 ************************************ 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid0 4 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid0 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid0 '!=' raid1 ']' 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # 
raid_pid=952902 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 952902 /var/tmp/spdk-raid.sock 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 952902 ']' 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:23.609 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.609 13:27:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:18:23.609 [2024-07-25 13:27:04.378380] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:18:23.609 [2024-07-25 13:27:04.378428] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid952902 ] 00:18:23.870 [2024-07-25 13:27:04.465623] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:23.870 [2024-07-25 13:27:04.528224] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:23.870 [2024-07-25 13:27:04.567400] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:23.870 [2024-07-25 13:27:04.567424] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:24.809 13:27:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:18:25.378 malloc1 00:18:25.378 13:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:18:25.947 [2024-07-25 13:27:06.616408] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:25.947 [2024-07-25 13:27:06.616443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:25.947 [2024-07-25 13:27:06.616456] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13709b0 00:18:25.947 [2024-07-25 13:27:06.616462] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:25.947 [2024-07-25 13:27:06.617771] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:25.947 [2024-07-25 13:27:06.617792] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:25.947 pt1 00:18:25.947 13:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:18:25.947 13:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:25.947 13:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:18:25.947 13:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:18:25.947 13:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:18:25.947 13:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:25.947 13:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:18:25.947 13:27:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:25.947 13:27:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:18:26.517 malloc2 00:18:26.517 13:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:27.086 [2024-07-25 13:27:07.701125] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:27.086 [2024-07-25 13:27:07.701156] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:27.086 [2024-07-25 13:27:07.701169] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1371db0 00:18:27.086 [2024-07-25 13:27:07.701176] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:27.086 [2024-07-25 13:27:07.702411] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:27.086 [2024-07-25 13:27:07.702431] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:27.086 pt2 00:18:27.086 13:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:18:27.086 13:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:27.086 13:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:18:27.086 13:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:18:27.087 13:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:18:27.087 13:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:27.087 13:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:18:27.087 13:27:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:27.087 13:27:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:18:27.655 malloc3 00:18:27.655 13:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:28.223 [2024-07-25 13:27:08.785699] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:28.223 [2024-07-25 13:27:08.785730] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:28.223 [2024-07-25 13:27:08.785740] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1508780 00:18:28.223 [2024-07-25 13:27:08.785746] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:28.223 [2024-07-25 13:27:08.786937] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:28.223 [2024-07-25 13:27:08.786956] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:28.223 pt3 00:18:28.223 13:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:18:28.223 13:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:28.223 13:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:18:28.223 13:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:18:28.223 13:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:18:28.223 13:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:18:28.223 
13:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:18:28.223 13:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:18:28.223 13:27:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:18:28.223 malloc4 00:18:28.483 13:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:28.742 [2024-07-25 13:27:09.525412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:28.743 [2024-07-25 13:27:09.525443] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:28.743 [2024-07-25 13:27:09.525454] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x150b0a0 00:18:28.743 [2024-07-25 13:27:09.525460] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:28.743 [2024-07-25 13:27:09.526666] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:28.743 [2024-07-25 13:27:09.526690] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:28.743 pt4 00:18:29.007 13:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:18:29.007 13:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:18:29.007 13:27:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:18:29.335 [2024-07-25 13:27:10.071163] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:18:29.335 [2024-07-25 13:27:10.072207] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:29.335 [2024-07-25 13:27:10.072250] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:29.335 [2024-07-25 13:27:10.072283] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:29.335 [2024-07-25 13:27:10.072410] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1368970 00:18:29.335 [2024-07-25 13:27:10.072417] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:29.335 [2024-07-25 13:27:10.072590] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1366bb0 00:18:29.335 [2024-07-25 13:27:10.072703] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1368970 00:18:29.335 [2024-07-25 13:27:10.072708] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1368970 00:18:29.335 [2024-07-25 13:27:10.072793] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:29.335 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:29.335 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:29.335 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:29.335 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:29.335 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:29.335 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:29.335 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:29.335 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:18:29.335 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:29.335 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:29.595 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:29.595 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:29.595 13:27:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:29.595 "name": "raid_bdev1", 00:18:29.595 "uuid": "53ccac53-6fbe-4ffc-93b5-10d58e626bc0", 00:18:29.595 "strip_size_kb": 64, 00:18:29.595 "state": "online", 00:18:29.595 "raid_level": "raid0", 00:18:29.595 "superblock": true, 00:18:29.595 "num_base_bdevs": 4, 00:18:29.595 "num_base_bdevs_discovered": 4, 00:18:29.595 "num_base_bdevs_operational": 4, 00:18:29.595 "base_bdevs_list": [ 00:18:29.595 { 00:18:29.595 "name": "pt1", 00:18:29.595 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:29.595 "is_configured": true, 00:18:29.595 "data_offset": 2048, 00:18:29.595 "data_size": 63488 00:18:29.595 }, 00:18:29.595 { 00:18:29.595 "name": "pt2", 00:18:29.595 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:29.595 "is_configured": true, 00:18:29.595 "data_offset": 2048, 00:18:29.595 "data_size": 63488 00:18:29.595 }, 00:18:29.595 { 00:18:29.595 "name": "pt3", 00:18:29.595 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:29.595 "is_configured": true, 00:18:29.595 "data_offset": 2048, 00:18:29.595 "data_size": 63488 00:18:29.595 }, 00:18:29.595 { 00:18:29.595 "name": "pt4", 00:18:29.595 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:29.595 "is_configured": true, 00:18:29.595 "data_offset": 2048, 00:18:29.595 "data_size": 63488 00:18:29.595 } 00:18:29.595 ] 00:18:29.595 }' 00:18:29.595 13:27:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:29.595 13:27:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:30.534 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:18:30.534 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:30.534 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:30.534 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:30.534 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:30.534 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:30.534 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:30.534 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:30.794 [2024-07-25 13:27:11.362695] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:30.794 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:30.794 "name": "raid_bdev1", 00:18:30.794 "aliases": [ 00:18:30.794 "53ccac53-6fbe-4ffc-93b5-10d58e626bc0" 00:18:30.794 ], 00:18:30.794 "product_name": "Raid Volume", 00:18:30.794 "block_size": 512, 00:18:30.794 "num_blocks": 253952, 00:18:30.794 "uuid": "53ccac53-6fbe-4ffc-93b5-10d58e626bc0", 00:18:30.794 "assigned_rate_limits": { 00:18:30.794 "rw_ios_per_sec": 0, 00:18:30.794 "rw_mbytes_per_sec": 0, 00:18:30.794 "r_mbytes_per_sec": 0, 00:18:30.794 "w_mbytes_per_sec": 0 00:18:30.794 }, 00:18:30.794 "claimed": false, 00:18:30.794 "zoned": false, 00:18:30.794 "supported_io_types": { 00:18:30.794 "read": true, 00:18:30.794 "write": true, 00:18:30.794 
"unmap": true, 00:18:30.794 "flush": true, 00:18:30.794 "reset": true, 00:18:30.794 "nvme_admin": false, 00:18:30.794 "nvme_io": false, 00:18:30.794 "nvme_io_md": false, 00:18:30.794 "write_zeroes": true, 00:18:30.794 "zcopy": false, 00:18:30.794 "get_zone_info": false, 00:18:30.794 "zone_management": false, 00:18:30.794 "zone_append": false, 00:18:30.794 "compare": false, 00:18:30.794 "compare_and_write": false, 00:18:30.794 "abort": false, 00:18:30.794 "seek_hole": false, 00:18:30.794 "seek_data": false, 00:18:30.794 "copy": false, 00:18:30.794 "nvme_iov_md": false 00:18:30.794 }, 00:18:30.794 "memory_domains": [ 00:18:30.794 { 00:18:30.794 "dma_device_id": "system", 00:18:30.794 "dma_device_type": 1 00:18:30.794 }, 00:18:30.794 { 00:18:30.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.794 "dma_device_type": 2 00:18:30.794 }, 00:18:30.794 { 00:18:30.794 "dma_device_id": "system", 00:18:30.794 "dma_device_type": 1 00:18:30.794 }, 00:18:30.794 { 00:18:30.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.794 "dma_device_type": 2 00:18:30.794 }, 00:18:30.794 { 00:18:30.794 "dma_device_id": "system", 00:18:30.794 "dma_device_type": 1 00:18:30.794 }, 00:18:30.794 { 00:18:30.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.794 "dma_device_type": 2 00:18:30.794 }, 00:18:30.794 { 00:18:30.794 "dma_device_id": "system", 00:18:30.794 "dma_device_type": 1 00:18:30.794 }, 00:18:30.794 { 00:18:30.794 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.794 "dma_device_type": 2 00:18:30.794 } 00:18:30.794 ], 00:18:30.794 "driver_specific": { 00:18:30.794 "raid": { 00:18:30.794 "uuid": "53ccac53-6fbe-4ffc-93b5-10d58e626bc0", 00:18:30.794 "strip_size_kb": 64, 00:18:30.794 "state": "online", 00:18:30.794 "raid_level": "raid0", 00:18:30.794 "superblock": true, 00:18:30.794 "num_base_bdevs": 4, 00:18:30.794 "num_base_bdevs_discovered": 4, 00:18:30.794 "num_base_bdevs_operational": 4, 00:18:30.794 "base_bdevs_list": [ 00:18:30.794 { 00:18:30.794 "name": "pt1", 
00:18:30.794 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:30.794 "is_configured": true, 00:18:30.794 "data_offset": 2048, 00:18:30.794 "data_size": 63488 00:18:30.794 }, 00:18:30.794 { 00:18:30.794 "name": "pt2", 00:18:30.794 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:30.794 "is_configured": true, 00:18:30.794 "data_offset": 2048, 00:18:30.794 "data_size": 63488 00:18:30.794 }, 00:18:30.794 { 00:18:30.794 "name": "pt3", 00:18:30.794 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:30.794 "is_configured": true, 00:18:30.794 "data_offset": 2048, 00:18:30.794 "data_size": 63488 00:18:30.794 }, 00:18:30.794 { 00:18:30.794 "name": "pt4", 00:18:30.794 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:30.794 "is_configured": true, 00:18:30.794 "data_offset": 2048, 00:18:30.794 "data_size": 63488 00:18:30.794 } 00:18:30.794 ] 00:18:30.794 } 00:18:30.794 } 00:18:30.794 }' 00:18:30.794 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:30.794 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:30.794 pt2 00:18:30.794 pt3 00:18:30.794 pt4' 00:18:30.794 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:30.794 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:30.794 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:31.054 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:31.054 "name": "pt1", 00:18:31.054 "aliases": [ 00:18:31.054 "00000000-0000-0000-0000-000000000001" 00:18:31.054 ], 00:18:31.054 "product_name": "passthru", 00:18:31.054 "block_size": 512, 00:18:31.054 "num_blocks": 65536, 00:18:31.054 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:18:31.054 "assigned_rate_limits": { 00:18:31.054 "rw_ios_per_sec": 0, 00:18:31.054 "rw_mbytes_per_sec": 0, 00:18:31.054 "r_mbytes_per_sec": 0, 00:18:31.054 "w_mbytes_per_sec": 0 00:18:31.054 }, 00:18:31.054 "claimed": true, 00:18:31.054 "claim_type": "exclusive_write", 00:18:31.054 "zoned": false, 00:18:31.054 "supported_io_types": { 00:18:31.054 "read": true, 00:18:31.054 "write": true, 00:18:31.054 "unmap": true, 00:18:31.054 "flush": true, 00:18:31.054 "reset": true, 00:18:31.054 "nvme_admin": false, 00:18:31.054 "nvme_io": false, 00:18:31.054 "nvme_io_md": false, 00:18:31.054 "write_zeroes": true, 00:18:31.054 "zcopy": true, 00:18:31.054 "get_zone_info": false, 00:18:31.054 "zone_management": false, 00:18:31.054 "zone_append": false, 00:18:31.054 "compare": false, 00:18:31.054 "compare_and_write": false, 00:18:31.054 "abort": true, 00:18:31.054 "seek_hole": false, 00:18:31.054 "seek_data": false, 00:18:31.054 "copy": true, 00:18:31.054 "nvme_iov_md": false 00:18:31.054 }, 00:18:31.054 "memory_domains": [ 00:18:31.054 { 00:18:31.054 "dma_device_id": "system", 00:18:31.054 "dma_device_type": 1 00:18:31.054 }, 00:18:31.054 { 00:18:31.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.054 "dma_device_type": 2 00:18:31.054 } 00:18:31.054 ], 00:18:31.054 "driver_specific": { 00:18:31.054 "passthru": { 00:18:31.054 "name": "pt1", 00:18:31.054 "base_bdev_name": "malloc1" 00:18:31.054 } 00:18:31.054 } 00:18:31.054 }' 00:18:31.054 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.054 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.054 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:31.054 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.054 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.315 13:27:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:31.315 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.315 13:27:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.315 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:31.315 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.315 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.575 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:31.575 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:31.575 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:31.575 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:31.575 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:31.575 "name": "pt2", 00:18:31.575 "aliases": [ 00:18:31.575 "00000000-0000-0000-0000-000000000002" 00:18:31.575 ], 00:18:31.575 "product_name": "passthru", 00:18:31.575 "block_size": 512, 00:18:31.575 "num_blocks": 65536, 00:18:31.575 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:31.575 "assigned_rate_limits": { 00:18:31.575 "rw_ios_per_sec": 0, 00:18:31.575 "rw_mbytes_per_sec": 0, 00:18:31.575 "r_mbytes_per_sec": 0, 00:18:31.575 "w_mbytes_per_sec": 0 00:18:31.575 }, 00:18:31.575 "claimed": true, 00:18:31.575 "claim_type": "exclusive_write", 00:18:31.575 "zoned": false, 00:18:31.575 "supported_io_types": { 00:18:31.575 "read": true, 00:18:31.575 "write": true, 00:18:31.575 "unmap": true, 00:18:31.575 "flush": true, 00:18:31.575 "reset": true, 00:18:31.575 "nvme_admin": false, 00:18:31.575 
"nvme_io": false, 00:18:31.575 "nvme_io_md": false, 00:18:31.575 "write_zeroes": true, 00:18:31.575 "zcopy": true, 00:18:31.575 "get_zone_info": false, 00:18:31.575 "zone_management": false, 00:18:31.575 "zone_append": false, 00:18:31.575 "compare": false, 00:18:31.575 "compare_and_write": false, 00:18:31.575 "abort": true, 00:18:31.575 "seek_hole": false, 00:18:31.575 "seek_data": false, 00:18:31.575 "copy": true, 00:18:31.575 "nvme_iov_md": false 00:18:31.575 }, 00:18:31.575 "memory_domains": [ 00:18:31.575 { 00:18:31.575 "dma_device_id": "system", 00:18:31.575 "dma_device_type": 1 00:18:31.575 }, 00:18:31.575 { 00:18:31.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:31.575 "dma_device_type": 2 00:18:31.575 } 00:18:31.575 ], 00:18:31.575 "driver_specific": { 00:18:31.575 "passthru": { 00:18:31.575 "name": "pt2", 00:18:31.575 "base_bdev_name": "malloc2" 00:18:31.575 } 00:18:31.575 } 00:18:31.575 }' 00:18:31.575 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.836 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:31.836 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:31.836 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.836 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.836 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:31.836 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.096 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.096 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:32.096 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.096 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:18:32.096 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:32.096 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:32.096 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:32.096 13:27:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:32.356 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:32.356 "name": "pt3", 00:18:32.356 "aliases": [ 00:18:32.356 "00000000-0000-0000-0000-000000000003" 00:18:32.356 ], 00:18:32.356 "product_name": "passthru", 00:18:32.356 "block_size": 512, 00:18:32.356 "num_blocks": 65536, 00:18:32.356 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:32.356 "assigned_rate_limits": { 00:18:32.356 "rw_ios_per_sec": 0, 00:18:32.356 "rw_mbytes_per_sec": 0, 00:18:32.356 "r_mbytes_per_sec": 0, 00:18:32.356 "w_mbytes_per_sec": 0 00:18:32.356 }, 00:18:32.356 "claimed": true, 00:18:32.356 "claim_type": "exclusive_write", 00:18:32.356 "zoned": false, 00:18:32.356 "supported_io_types": { 00:18:32.356 "read": true, 00:18:32.356 "write": true, 00:18:32.356 "unmap": true, 00:18:32.356 "flush": true, 00:18:32.356 "reset": true, 00:18:32.356 "nvme_admin": false, 00:18:32.356 "nvme_io": false, 00:18:32.356 "nvme_io_md": false, 00:18:32.356 "write_zeroes": true, 00:18:32.356 "zcopy": true, 00:18:32.356 "get_zone_info": false, 00:18:32.356 "zone_management": false, 00:18:32.356 "zone_append": false, 00:18:32.356 "compare": false, 00:18:32.356 "compare_and_write": false, 00:18:32.356 "abort": true, 00:18:32.356 "seek_hole": false, 00:18:32.356 "seek_data": false, 00:18:32.356 "copy": true, 00:18:32.356 "nvme_iov_md": false 00:18:32.356 }, 00:18:32.356 "memory_domains": [ 00:18:32.356 { 00:18:32.356 "dma_device_id": "system", 00:18:32.356 
"dma_device_type": 1 00:18:32.356 }, 00:18:32.356 { 00:18:32.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.356 "dma_device_type": 2 00:18:32.356 } 00:18:32.356 ], 00:18:32.356 "driver_specific": { 00:18:32.356 "passthru": { 00:18:32.356 "name": "pt3", 00:18:32.356 "base_bdev_name": "malloc3" 00:18:32.356 } 00:18:32.356 } 00:18:32.356 }' 00:18:32.356 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.356 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.356 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:32.356 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:32.616 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:32.876 13:27:13 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:32.876 "name": "pt4", 00:18:32.876 "aliases": [ 00:18:32.876 "00000000-0000-0000-0000-000000000004" 00:18:32.876 ], 00:18:32.876 "product_name": "passthru", 00:18:32.876 "block_size": 512, 00:18:32.876 "num_blocks": 65536, 00:18:32.876 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:32.876 "assigned_rate_limits": { 00:18:32.876 "rw_ios_per_sec": 0, 00:18:32.876 "rw_mbytes_per_sec": 0, 00:18:32.876 "r_mbytes_per_sec": 0, 00:18:32.877 "w_mbytes_per_sec": 0 00:18:32.877 }, 00:18:32.877 "claimed": true, 00:18:32.877 "claim_type": "exclusive_write", 00:18:32.877 "zoned": false, 00:18:32.877 "supported_io_types": { 00:18:32.877 "read": true, 00:18:32.877 "write": true, 00:18:32.877 "unmap": true, 00:18:32.877 "flush": true, 00:18:32.877 "reset": true, 00:18:32.877 "nvme_admin": false, 00:18:32.877 "nvme_io": false, 00:18:32.877 "nvme_io_md": false, 00:18:32.877 "write_zeroes": true, 00:18:32.877 "zcopy": true, 00:18:32.877 "get_zone_info": false, 00:18:32.877 "zone_management": false, 00:18:32.877 "zone_append": false, 00:18:32.877 "compare": false, 00:18:32.877 "compare_and_write": false, 00:18:32.877 "abort": true, 00:18:32.877 "seek_hole": false, 00:18:32.877 "seek_data": false, 00:18:32.877 "copy": true, 00:18:32.877 "nvme_iov_md": false 00:18:32.877 }, 00:18:32.877 "memory_domains": [ 00:18:32.877 { 00:18:32.877 "dma_device_id": "system", 00:18:32.877 "dma_device_type": 1 00:18:32.877 }, 00:18:32.877 { 00:18:32.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:32.877 "dma_device_type": 2 00:18:32.877 } 00:18:32.877 ], 00:18:32.877 "driver_specific": { 00:18:32.877 "passthru": { 00:18:32.877 "name": "pt4", 00:18:32.877 "base_bdev_name": "malloc4" 00:18:32.877 } 00:18:32.877 } 00:18:32.877 }' 00:18:32.877 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:32.877 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:33.137 13:27:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:33.137 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.137 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:33.137 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:33.137 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.137 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:33.137 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:33.137 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.137 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:33.398 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:33.398 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:33.398 13:27:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:18:33.398 [2024-07-25 13:27:14.113664] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:33.398 13:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=53ccac53-6fbe-4ffc-93b5-10d58e626bc0 00:18:33.398 13:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 53ccac53-6fbe-4ffc-93b5-10d58e626bc0 ']' 00:18:33.398 13:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:33.967 [2024-07-25 13:27:14.638715] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:33.967 
[2024-07-25 13:27:14.638731] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:33.967 [2024-07-25 13:27:14.638770] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:33.967 [2024-07-25 13:27:14.638816] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:33.967 [2024-07-25 13:27:14.638823] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1368970 name raid_bdev1, state offline 00:18:33.967 13:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.968 13:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:18:34.227 13:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:18:34.227 13:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:18:34.227 13:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:18:34.227 13:27:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:18:34.797 13:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:18:34.797 13:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:35.365 13:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:18:35.365 13:27:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:18:35.934 13:27:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:18:35.934 13:27:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:18:36.503 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:18:36.764 [2024-07-25 13:27:17.417613] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:18:36.764 [2024-07-25 13:27:17.418679] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:18:36.764 [2024-07-25 13:27:17.418713] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:18:36.764 [2024-07-25 13:27:17.418743] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:18:36.764 [2024-07-25 13:27:17.418778] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:18:36.764 [2024-07-25 13:27:17.418806] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:18:36.764 [2024-07-25 13:27:17.418820] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:18:36.764 [2024-07-25 13:27:17.418834] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:18:36.764 
[2024-07-25 13:27:17.418843] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:36.764 [2024-07-25 13:27:17.418850] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1370e50 name raid_bdev1, state configuring 00:18:36.764 request: 00:18:36.764 { 00:18:36.764 "name": "raid_bdev1", 00:18:36.764 "raid_level": "raid0", 00:18:36.764 "base_bdevs": [ 00:18:36.764 "malloc1", 00:18:36.764 "malloc2", 00:18:36.764 "malloc3", 00:18:36.764 "malloc4" 00:18:36.764 ], 00:18:36.764 "strip_size_kb": 64, 00:18:36.764 "superblock": false, 00:18:36.764 "method": "bdev_raid_create", 00:18:36.764 "req_id": 1 00:18:36.764 } 00:18:36.764 Got JSON-RPC error response 00:18:36.764 response: 00:18:36.764 { 00:18:36.764 "code": -17, 00:18:36.764 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:18:36.764 } 00:18:36.764 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:18:36.764 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:18:36.764 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:18:36.764 13:27:17 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:18:36.764 13:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.764 13:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:18:37.334 13:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:18:37.334 13:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:18:37.334 13:27:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:18:37.904 [2024-07-25 13:27:18.488339] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:18:37.904 [2024-07-25 13:27:18.488368] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:37.904 [2024-07-25 13:27:18.488381] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1370be0 00:18:37.904 [2024-07-25 13:27:18.488388] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:37.904 [2024-07-25 13:27:18.489644] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:37.904 [2024-07-25 13:27:18.489665] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:18:37.904 [2024-07-25 13:27:18.489711] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:18:37.904 [2024-07-25 13:27:18.489731] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:18:37.904 pt1 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:37.904 13:27:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:38.473 13:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:38.473 "name": "raid_bdev1", 00:18:38.473 "uuid": "53ccac53-6fbe-4ffc-93b5-10d58e626bc0", 00:18:38.473 "strip_size_kb": 64, 00:18:38.473 "state": "configuring", 00:18:38.473 "raid_level": "raid0", 00:18:38.473 "superblock": true, 00:18:38.473 "num_base_bdevs": 4, 00:18:38.473 "num_base_bdevs_discovered": 1, 00:18:38.473 "num_base_bdevs_operational": 4, 00:18:38.473 "base_bdevs_list": [ 00:18:38.473 { 00:18:38.473 "name": "pt1", 00:18:38.473 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:38.473 "is_configured": true, 00:18:38.473 "data_offset": 2048, 00:18:38.473 "data_size": 63488 00:18:38.474 }, 00:18:38.474 { 00:18:38.474 "name": null, 00:18:38.474 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:38.474 "is_configured": false, 00:18:38.474 "data_offset": 2048, 00:18:38.474 "data_size": 63488 00:18:38.474 }, 00:18:38.474 { 00:18:38.474 "name": null, 00:18:38.474 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:38.474 "is_configured": false, 00:18:38.474 "data_offset": 2048, 00:18:38.474 "data_size": 63488 00:18:38.474 }, 00:18:38.474 { 00:18:38.474 "name": null, 00:18:38.474 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:38.474 "is_configured": false, 00:18:38.474 "data_offset": 2048, 00:18:38.474 "data_size": 63488 00:18:38.474 } 00:18:38.474 ] 00:18:38.474 }' 00:18:38.474 13:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:38.474 13:27:19 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:39.413 13:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:18:39.413 13:27:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:39.413 [2024-07-25 13:27:20.112485] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:39.413 [2024-07-25 13:27:20.112524] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:39.413 [2024-07-25 13:27:20.112536] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13688e0 00:18:39.413 [2024-07-25 13:27:20.112542] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:39.413 [2024-07-25 13:27:20.112824] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:39.413 [2024-07-25 13:27:20.112836] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:39.413 [2024-07-25 13:27:20.112881] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:39.413 [2024-07-25 13:27:20.112896] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:39.413 pt2 00:18:39.413 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:18:39.674 [2024-07-25 13:27:20.308979] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:39.674 13:27:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.674 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:40.008 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:40.008 "name": "raid_bdev1", 00:18:40.008 "uuid": "53ccac53-6fbe-4ffc-93b5-10d58e626bc0", 00:18:40.008 "strip_size_kb": 64, 00:18:40.009 "state": "configuring", 00:18:40.009 "raid_level": "raid0", 00:18:40.009 "superblock": true, 00:18:40.009 "num_base_bdevs": 4, 00:18:40.009 "num_base_bdevs_discovered": 1, 00:18:40.009 "num_base_bdevs_operational": 4, 00:18:40.009 "base_bdevs_list": [ 00:18:40.009 { 00:18:40.009 "name": "pt1", 00:18:40.009 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:40.009 "is_configured": true, 00:18:40.009 "data_offset": 2048, 00:18:40.009 "data_size": 63488 00:18:40.009 }, 00:18:40.009 { 00:18:40.009 "name": null, 00:18:40.009 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:40.009 
"is_configured": false, 00:18:40.009 "data_offset": 2048, 00:18:40.009 "data_size": 63488 00:18:40.009 }, 00:18:40.009 { 00:18:40.009 "name": null, 00:18:40.009 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:40.009 "is_configured": false, 00:18:40.009 "data_offset": 2048, 00:18:40.009 "data_size": 63488 00:18:40.009 }, 00:18:40.009 { 00:18:40.009 "name": null, 00:18:40.009 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:40.009 "is_configured": false, 00:18:40.009 "data_offset": 2048, 00:18:40.009 "data_size": 63488 00:18:40.009 } 00:18:40.009 ] 00:18:40.009 }' 00:18:40.009 13:27:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:40.009 13:27:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:40.579 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:18:40.579 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:18:40.579 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:18:40.579 [2024-07-25 13:27:21.219281] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:18:40.579 [2024-07-25 13:27:21.219310] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:40.579 [2024-07-25 13:27:21.219322] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x150b2d0 00:18:40.579 [2024-07-25 13:27:21.219329] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:40.579 [2024-07-25 13:27:21.219599] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:40.579 [2024-07-25 13:27:21.219610] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:18:40.579 [2024-07-25 13:27:21.219654] 
bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:18:40.579 [2024-07-25 13:27:21.219667] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:18:40.579 pt2 00:18:40.579 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:18:40.579 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:18:40.579 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:18:40.841 [2024-07-25 13:27:21.383697] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:18:40.841 [2024-07-25 13:27:21.383715] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:40.841 [2024-07-25 13:27:21.383723] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1369970 00:18:40.841 [2024-07-25 13:27:21.383729] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:40.841 [2024-07-25 13:27:21.383941] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:40.841 [2024-07-25 13:27:21.383951] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:18:40.841 [2024-07-25 13:27:21.383982] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:18:40.841 [2024-07-25 13:27:21.383992] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:18:40.841 pt3 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:18:40.841 [2024-07-25 13:27:21.536078] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:18:40.841 [2024-07-25 13:27:21.536101] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:40.841 [2024-07-25 13:27:21.536110] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1367420 00:18:40.841 [2024-07-25 13:27:21.536115] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:40.841 [2024-07-25 13:27:21.536326] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:40.841 [2024-07-25 13:27:21.536336] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:18:40.841 [2024-07-25 13:27:21.536367] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:18:40.841 [2024-07-25 13:27:21.536377] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:18:40.841 [2024-07-25 13:27:21.536467] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x136bd00 00:18:40.841 [2024-07-25 13:27:21.536473] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:40.841 [2024-07-25 13:27:21.536610] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x136b240 00:18:40.841 [2024-07-25 13:27:21.536709] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x136bd00 00:18:40.841 [2024-07-25 13:27:21.536715] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x136bd00 00:18:40.841 [2024-07-25 13:27:21.536784] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:40.841 pt4 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 
00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:40.841 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:41.102 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.102 "name": "raid_bdev1", 00:18:41.102 "uuid": "53ccac53-6fbe-4ffc-93b5-10d58e626bc0", 00:18:41.102 "strip_size_kb": 64, 00:18:41.102 "state": "online", 00:18:41.102 "raid_level": "raid0", 00:18:41.102 "superblock": true, 00:18:41.102 "num_base_bdevs": 4, 00:18:41.102 "num_base_bdevs_discovered": 4, 00:18:41.102 "num_base_bdevs_operational": 4, 
00:18:41.102 "base_bdevs_list": [ 00:18:41.102 { 00:18:41.102 "name": "pt1", 00:18:41.102 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:41.102 "is_configured": true, 00:18:41.102 "data_offset": 2048, 00:18:41.102 "data_size": 63488 00:18:41.102 }, 00:18:41.102 { 00:18:41.102 "name": "pt2", 00:18:41.102 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:41.102 "is_configured": true, 00:18:41.102 "data_offset": 2048, 00:18:41.102 "data_size": 63488 00:18:41.102 }, 00:18:41.102 { 00:18:41.102 "name": "pt3", 00:18:41.102 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:41.102 "is_configured": true, 00:18:41.102 "data_offset": 2048, 00:18:41.102 "data_size": 63488 00:18:41.102 }, 00:18:41.102 { 00:18:41.102 "name": "pt4", 00:18:41.102 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:41.102 "is_configured": true, 00:18:41.102 "data_offset": 2048, 00:18:41.102 "data_size": 63488 00:18:41.102 } 00:18:41.102 ] 00:18:41.102 }' 00:18:41.102 13:27:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.102 13:27:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:41.672 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:18:41.672 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:18:41.672 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:41.672 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:41.672 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:41.672 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:41.672 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:18:41.673 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:41.932 [2024-07-25 13:27:22.474702] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:41.932 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:41.932 "name": "raid_bdev1", 00:18:41.932 "aliases": [ 00:18:41.932 "53ccac53-6fbe-4ffc-93b5-10d58e626bc0" 00:18:41.932 ], 00:18:41.932 "product_name": "Raid Volume", 00:18:41.932 "block_size": 512, 00:18:41.932 "num_blocks": 253952, 00:18:41.932 "uuid": "53ccac53-6fbe-4ffc-93b5-10d58e626bc0", 00:18:41.932 "assigned_rate_limits": { 00:18:41.932 "rw_ios_per_sec": 0, 00:18:41.932 "rw_mbytes_per_sec": 0, 00:18:41.932 "r_mbytes_per_sec": 0, 00:18:41.932 "w_mbytes_per_sec": 0 00:18:41.932 }, 00:18:41.932 "claimed": false, 00:18:41.932 "zoned": false, 00:18:41.932 "supported_io_types": { 00:18:41.932 "read": true, 00:18:41.932 "write": true, 00:18:41.932 "unmap": true, 00:18:41.932 "flush": true, 00:18:41.932 "reset": true, 00:18:41.932 "nvme_admin": false, 00:18:41.932 "nvme_io": false, 00:18:41.932 "nvme_io_md": false, 00:18:41.932 "write_zeroes": true, 00:18:41.932 "zcopy": false, 00:18:41.932 "get_zone_info": false, 00:18:41.932 "zone_management": false, 00:18:41.932 "zone_append": false, 00:18:41.932 "compare": false, 00:18:41.932 "compare_and_write": false, 00:18:41.932 "abort": false, 00:18:41.932 "seek_hole": false, 00:18:41.932 "seek_data": false, 00:18:41.932 "copy": false, 00:18:41.932 "nvme_iov_md": false 00:18:41.932 }, 00:18:41.932 "memory_domains": [ 00:18:41.932 { 00:18:41.932 "dma_device_id": "system", 00:18:41.932 "dma_device_type": 1 00:18:41.932 }, 00:18:41.932 { 00:18:41.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:41.932 "dma_device_type": 2 00:18:41.932 }, 00:18:41.932 { 00:18:41.932 "dma_device_id": "system", 00:18:41.932 "dma_device_type": 1 00:18:41.932 }, 00:18:41.932 { 00:18:41.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:18:41.932 "dma_device_type": 2 00:18:41.932 }, 00:18:41.932 { 00:18:41.932 "dma_device_id": "system", 00:18:41.932 "dma_device_type": 1 00:18:41.932 }, 00:18:41.932 { 00:18:41.932 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:41.932 "dma_device_type": 2 00:18:41.932 }, 00:18:41.932 { 00:18:41.932 "dma_device_id": "system", 00:18:41.932 "dma_device_type": 1 00:18:41.933 }, 00:18:41.933 { 00:18:41.933 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:41.933 "dma_device_type": 2 00:18:41.933 } 00:18:41.933 ], 00:18:41.933 "driver_specific": { 00:18:41.933 "raid": { 00:18:41.933 "uuid": "53ccac53-6fbe-4ffc-93b5-10d58e626bc0", 00:18:41.933 "strip_size_kb": 64, 00:18:41.933 "state": "online", 00:18:41.933 "raid_level": "raid0", 00:18:41.933 "superblock": true, 00:18:41.933 "num_base_bdevs": 4, 00:18:41.933 "num_base_bdevs_discovered": 4, 00:18:41.933 "num_base_bdevs_operational": 4, 00:18:41.933 "base_bdevs_list": [ 00:18:41.933 { 00:18:41.933 "name": "pt1", 00:18:41.933 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:41.933 "is_configured": true, 00:18:41.933 "data_offset": 2048, 00:18:41.933 "data_size": 63488 00:18:41.933 }, 00:18:41.933 { 00:18:41.933 "name": "pt2", 00:18:41.933 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:41.933 "is_configured": true, 00:18:41.933 "data_offset": 2048, 00:18:41.933 "data_size": 63488 00:18:41.933 }, 00:18:41.933 { 00:18:41.933 "name": "pt3", 00:18:41.933 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:41.933 "is_configured": true, 00:18:41.933 "data_offset": 2048, 00:18:41.933 "data_size": 63488 00:18:41.933 }, 00:18:41.933 { 00:18:41.933 "name": "pt4", 00:18:41.933 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:41.933 "is_configured": true, 00:18:41.933 "data_offset": 2048, 00:18:41.933 "data_size": 63488 00:18:41.933 } 00:18:41.933 ] 00:18:41.933 } 00:18:41.933 } 00:18:41.933 }' 00:18:41.933 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:41.933 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:18:41.933 pt2 00:18:41.933 pt3 00:18:41.933 pt4' 00:18:41.933 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:41.933 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:18:41.933 13:27:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:42.503 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:42.503 "name": "pt1", 00:18:42.503 "aliases": [ 00:18:42.503 "00000000-0000-0000-0000-000000000001" 00:18:42.503 ], 00:18:42.503 "product_name": "passthru", 00:18:42.503 "block_size": 512, 00:18:42.503 "num_blocks": 65536, 00:18:42.503 "uuid": "00000000-0000-0000-0000-000000000001", 00:18:42.503 "assigned_rate_limits": { 00:18:42.503 "rw_ios_per_sec": 0, 00:18:42.503 "rw_mbytes_per_sec": 0, 00:18:42.503 "r_mbytes_per_sec": 0, 00:18:42.503 "w_mbytes_per_sec": 0 00:18:42.503 }, 00:18:42.503 "claimed": true, 00:18:42.504 "claim_type": "exclusive_write", 00:18:42.504 "zoned": false, 00:18:42.504 "supported_io_types": { 00:18:42.504 "read": true, 00:18:42.504 "write": true, 00:18:42.504 "unmap": true, 00:18:42.504 "flush": true, 00:18:42.504 "reset": true, 00:18:42.504 "nvme_admin": false, 00:18:42.504 "nvme_io": false, 00:18:42.504 "nvme_io_md": false, 00:18:42.504 "write_zeroes": true, 00:18:42.504 "zcopy": true, 00:18:42.504 "get_zone_info": false, 00:18:42.504 "zone_management": false, 00:18:42.504 "zone_append": false, 00:18:42.504 "compare": false, 00:18:42.504 "compare_and_write": false, 00:18:42.504 "abort": true, 00:18:42.504 "seek_hole": false, 00:18:42.504 "seek_data": false, 00:18:42.504 "copy": true, 00:18:42.504 "nvme_iov_md": 
false 00:18:42.504 }, 00:18:42.504 "memory_domains": [ 00:18:42.504 { 00:18:42.504 "dma_device_id": "system", 00:18:42.504 "dma_device_type": 1 00:18:42.504 }, 00:18:42.504 { 00:18:42.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:42.504 "dma_device_type": 2 00:18:42.504 } 00:18:42.504 ], 00:18:42.504 "driver_specific": { 00:18:42.504 "passthru": { 00:18:42.504 "name": "pt1", 00:18:42.504 "base_bdev_name": "malloc1" 00:18:42.504 } 00:18:42.504 } 00:18:42.504 }' 00:18:42.504 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:42.504 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:42.504 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:42.504 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:42.504 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:42.764 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:42.764 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:42.764 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:42.764 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:42.764 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:42.764 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:43.025 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:43.025 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:43.025 13:27:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:18:43.025 13:27:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:43.596 "name": "pt2", 00:18:43.596 "aliases": [ 00:18:43.596 "00000000-0000-0000-0000-000000000002" 00:18:43.596 ], 00:18:43.596 "product_name": "passthru", 00:18:43.596 "block_size": 512, 00:18:43.596 "num_blocks": 65536, 00:18:43.596 "uuid": "00000000-0000-0000-0000-000000000002", 00:18:43.596 "assigned_rate_limits": { 00:18:43.596 "rw_ios_per_sec": 0, 00:18:43.596 "rw_mbytes_per_sec": 0, 00:18:43.596 "r_mbytes_per_sec": 0, 00:18:43.596 "w_mbytes_per_sec": 0 00:18:43.596 }, 00:18:43.596 "claimed": true, 00:18:43.596 "claim_type": "exclusive_write", 00:18:43.596 "zoned": false, 00:18:43.596 "supported_io_types": { 00:18:43.596 "read": true, 00:18:43.596 "write": true, 00:18:43.596 "unmap": true, 00:18:43.596 "flush": true, 00:18:43.596 "reset": true, 00:18:43.596 "nvme_admin": false, 00:18:43.596 "nvme_io": false, 00:18:43.596 "nvme_io_md": false, 00:18:43.596 "write_zeroes": true, 00:18:43.596 "zcopy": true, 00:18:43.596 "get_zone_info": false, 00:18:43.596 "zone_management": false, 00:18:43.596 "zone_append": false, 00:18:43.596 "compare": false, 00:18:43.596 "compare_and_write": false, 00:18:43.596 "abort": true, 00:18:43.596 "seek_hole": false, 00:18:43.596 "seek_data": false, 00:18:43.596 "copy": true, 00:18:43.596 "nvme_iov_md": false 00:18:43.596 }, 00:18:43.596 "memory_domains": [ 00:18:43.596 { 00:18:43.596 "dma_device_id": "system", 00:18:43.596 "dma_device_type": 1 00:18:43.596 }, 00:18:43.596 { 00:18:43.596 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:43.596 "dma_device_type": 2 00:18:43.596 } 00:18:43.596 ], 00:18:43.596 "driver_specific": { 00:18:43.596 "passthru": { 00:18:43.596 "name": "pt2", 00:18:43.596 "base_bdev_name": "malloc2" 00:18:43.596 } 00:18:43.596 } 00:18:43.596 }' 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:43.596 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:43.857 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:43.857 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:43.857 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:43.857 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:18:43.857 13:27:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:44.427 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:44.427 "name": "pt3", 00:18:44.427 "aliases": [ 00:18:44.427 "00000000-0000-0000-0000-000000000003" 00:18:44.427 ], 00:18:44.427 "product_name": "passthru", 00:18:44.427 "block_size": 512, 00:18:44.427 "num_blocks": 65536, 00:18:44.427 "uuid": "00000000-0000-0000-0000-000000000003", 00:18:44.427 "assigned_rate_limits": { 00:18:44.427 "rw_ios_per_sec": 0, 00:18:44.427 "rw_mbytes_per_sec": 0, 
00:18:44.427 "r_mbytes_per_sec": 0, 00:18:44.427 "w_mbytes_per_sec": 0 00:18:44.427 }, 00:18:44.427 "claimed": true, 00:18:44.427 "claim_type": "exclusive_write", 00:18:44.427 "zoned": false, 00:18:44.427 "supported_io_types": { 00:18:44.427 "read": true, 00:18:44.427 "write": true, 00:18:44.427 "unmap": true, 00:18:44.427 "flush": true, 00:18:44.427 "reset": true, 00:18:44.427 "nvme_admin": false, 00:18:44.427 "nvme_io": false, 00:18:44.427 "nvme_io_md": false, 00:18:44.427 "write_zeroes": true, 00:18:44.427 "zcopy": true, 00:18:44.427 "get_zone_info": false, 00:18:44.427 "zone_management": false, 00:18:44.427 "zone_append": false, 00:18:44.427 "compare": false, 00:18:44.427 "compare_and_write": false, 00:18:44.427 "abort": true, 00:18:44.427 "seek_hole": false, 00:18:44.427 "seek_data": false, 00:18:44.427 "copy": true, 00:18:44.427 "nvme_iov_md": false 00:18:44.427 }, 00:18:44.427 "memory_domains": [ 00:18:44.427 { 00:18:44.427 "dma_device_id": "system", 00:18:44.427 "dma_device_type": 1 00:18:44.427 }, 00:18:44.427 { 00:18:44.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.427 "dma_device_type": 2 00:18:44.427 } 00:18:44.427 ], 00:18:44.427 "driver_specific": { 00:18:44.427 "passthru": { 00:18:44.427 "name": "pt3", 00:18:44.427 "base_bdev_name": "malloc3" 00:18:44.427 } 00:18:44.427 } 00:18:44.427 }' 00:18:44.427 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:44.427 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:44.427 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:44.427 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:44.427 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:44.688 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:44.688 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:18:44.688 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:44.688 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:44.688 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:44.688 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:44.688 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:44.688 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:44.688 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:18:44.688 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:44.949 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:44.949 "name": "pt4", 00:18:44.949 "aliases": [ 00:18:44.949 "00000000-0000-0000-0000-000000000004" 00:18:44.949 ], 00:18:44.949 "product_name": "passthru", 00:18:44.949 "block_size": 512, 00:18:44.949 "num_blocks": 65536, 00:18:44.949 "uuid": "00000000-0000-0000-0000-000000000004", 00:18:44.949 "assigned_rate_limits": { 00:18:44.949 "rw_ios_per_sec": 0, 00:18:44.949 "rw_mbytes_per_sec": 0, 00:18:44.949 "r_mbytes_per_sec": 0, 00:18:44.949 "w_mbytes_per_sec": 0 00:18:44.949 }, 00:18:44.949 "claimed": true, 00:18:44.949 "claim_type": "exclusive_write", 00:18:44.949 "zoned": false, 00:18:44.949 "supported_io_types": { 00:18:44.949 "read": true, 00:18:44.949 "write": true, 00:18:44.949 "unmap": true, 00:18:44.949 "flush": true, 00:18:44.949 "reset": true, 00:18:44.949 "nvme_admin": false, 00:18:44.949 "nvme_io": false, 00:18:44.949 "nvme_io_md": false, 00:18:44.949 "write_zeroes": true, 00:18:44.949 "zcopy": true, 00:18:44.949 "get_zone_info": false, 00:18:44.949 
"zone_management": false, 00:18:44.949 "zone_append": false, 00:18:44.949 "compare": false, 00:18:44.949 "compare_and_write": false, 00:18:44.949 "abort": true, 00:18:44.949 "seek_hole": false, 00:18:44.949 "seek_data": false, 00:18:44.949 "copy": true, 00:18:44.949 "nvme_iov_md": false 00:18:44.949 }, 00:18:44.949 "memory_domains": [ 00:18:44.949 { 00:18:44.949 "dma_device_id": "system", 00:18:44.949 "dma_device_type": 1 00:18:44.949 }, 00:18:44.949 { 00:18:44.949 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.949 "dma_device_type": 2 00:18:44.949 } 00:18:44.949 ], 00:18:44.949 "driver_specific": { 00:18:44.949 "passthru": { 00:18:44.949 "name": "pt4", 00:18:44.949 "base_bdev_name": "malloc4" 00:18:44.949 } 00:18:44.949 } 00:18:44.949 }' 00:18:44.949 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:44.949 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:45.209 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:45.209 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.209 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:45.209 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:45.209 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.209 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:45.209 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:45.209 13:27:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.469 13:27:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:45.469 13:27:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:45.469 13:27:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:18:45.469 13:27:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:18:46.040 [2024-07-25 13:27:26.593127] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 53ccac53-6fbe-4ffc-93b5-10d58e626bc0 '!=' 53ccac53-6fbe-4ffc-93b5-10d58e626bc0 ']' 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid0 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 952902 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 952902 ']' 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 952902 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 952902 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 952902' 00:18:46.040 killing process with pid 952902 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 952902 
00:18:46.040 [2024-07-25 13:27:26.678080] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:46.040 [2024-07-25 13:27:26.678123] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 952902 00:18:46.040 [2024-07-25 13:27:26.678169] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:46.040 [2024-07-25 13:27:26.678175] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x136bd00 name raid_bdev1, state offline 00:18:46.040 [2024-07-25 13:27:26.698894] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:18:46.040 00:18:46.040 real 0m22.493s 00:18:46.040 user 0m41.997s 00:18:46.040 sys 0m2.764s 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:18:46.040 13:27:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:18:46.040 ************************************ 00:18:46.040 END TEST raid_superblock_test 00:18:46.040 ************************************ 00:18:46.302 13:27:26 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:18:46.302 13:27:26 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:46.302 13:27:26 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:46.302 13:27:26 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:46.302 ************************************ 00:18:46.302 START TEST raid_read_error_test 00:18:46.302 ************************************ 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 4 read 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local 
raid_level=raid0 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 
00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.8JZ5tuLD6O 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=957349 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 957349 /var/tmp/spdk-raid.sock 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 957349 ']' 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:46.302 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:46.302 13:27:26 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:46.302 [2024-07-25 13:27:27.003036] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:18:46.302 [2024-07-25 13:27:27.003175] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid957349 ] 00:18:46.562 [2024-07-25 13:27:27.147128] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:46.562 [2024-07-25 13:27:27.222481] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:46.562 [2024-07-25 13:27:27.265989] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:46.562 [2024-07-25 13:27:27.266015] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:47.504 13:27:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:47.504 13:27:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:18:47.504 13:27:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:47.504 13:27:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:48.074 BaseBdev1_malloc 00:18:48.074 13:27:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:48.643 true 00:18:48.643 13:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:49.214 [2024-07-25 13:27:29.748731] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:49.214 [2024-07-25 13:27:29.748768] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:49.214 [2024-07-25 13:27:29.748781] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x22982a0 00:18:49.214 [2024-07-25 13:27:29.748788] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:49.214 [2024-07-25 13:27:29.750109] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:49.214 [2024-07-25 13:27:29.750130] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:49.214 BaseBdev1 00:18:49.214 13:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:49.214 13:27:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:49.785 BaseBdev2_malloc 00:18:49.785 13:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:50.355 true 00:18:50.355 13:27:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:18:50.615 [2024-07-25 13:27:31.386750] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:18:50.615 [2024-07-25 13:27:31.386781] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:50.615 [2024-07-25 13:27:31.386793] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2357420 00:18:50.615 [2024-07-25 13:27:31.386799] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:50.615 [2024-07-25 13:27:31.388011] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:50.615 [2024-07-25 13:27:31.388030] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:18:50.615 BaseBdev2 00:18:50.899 13:27:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:50.899 13:27:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:18:51.188 BaseBdev3_malloc 00:18:51.188 13:27:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:18:51.758 true 00:18:51.758 13:27:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:18:52.327 [2024-07-25 13:27:33.012729] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:18:52.327 [2024-07-25 13:27:33.012762] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:52.327 [2024-07-25 13:27:33.012776] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2358f70 00:18:52.327 [2024-07-25 13:27:33.012782] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:52.327 [2024-07-25 13:27:33.013981] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:52.328 [2024-07-25 13:27:33.014000] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:18:52.328 BaseBdev3 00:18:52.328 13:27:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:52.328 13:27:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:18:52.896 BaseBdev4_malloc 00:18:52.896 13:27:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:18:53.466 true 00:18:53.466 13:27:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:18:54.034 [2024-07-25 13:27:34.638690] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:18:54.034 [2024-07-25 13:27:34.638721] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:18:54.034 [2024-07-25 13:27:34.638732] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x235c1e0 00:18:54.034 [2024-07-25 13:27:34.638739] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:54.034 [2024-07-25 13:27:34.639930] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:54.034 [2024-07-25 13:27:34.639949] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:18:54.034 BaseBdev4 00:18:54.034 13:27:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:18:54.604 [2024-07-25 13:27:35.180069] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:54.604 [2024-07-25 13:27:35.181090] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:54.604 [2024-07-25 13:27:35.181145] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:54.604 [2024-07-25 13:27:35.181191] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:54.604 [2024-07-25 13:27:35.181357] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x235c800 00:18:54.604 [2024-07-25 13:27:35.181364] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:54.604 [2024-07-25 13:27:35.181521] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2357950 00:18:54.604 [2024-07-25 13:27:35.181645] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x235c800 00:18:54.604 [2024-07-25 13:27:35.181652] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x235c800 00:18:54.604 [2024-07-25 13:27:35.181741] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.604 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:55.173 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:55.173 "name": "raid_bdev1", 00:18:55.173 "uuid": "d54e1606-0462-4fb4-ae82-e31bc9e6352f", 00:18:55.173 "strip_size_kb": 64, 00:18:55.173 "state": "online", 00:18:55.173 "raid_level": "raid0", 00:18:55.173 "superblock": true, 00:18:55.173 "num_base_bdevs": 4, 00:18:55.173 "num_base_bdevs_discovered": 4, 00:18:55.173 "num_base_bdevs_operational": 4, 00:18:55.173 "base_bdevs_list": [ 00:18:55.173 { 00:18:55.173 "name": "BaseBdev1", 00:18:55.173 "uuid": "b26cc146-e8ec-533e-8bf1-f7c86aecd481", 00:18:55.173 "is_configured": true, 00:18:55.173 "data_offset": 2048, 00:18:55.173 "data_size": 63488 00:18:55.173 }, 00:18:55.173 { 00:18:55.173 "name": "BaseBdev2", 00:18:55.173 "uuid": "0a894ef7-0498-5361-ab57-76fbb109347d", 00:18:55.173 "is_configured": true, 00:18:55.173 "data_offset": 2048, 00:18:55.173 "data_size": 63488 00:18:55.173 }, 00:18:55.173 { 00:18:55.173 "name": "BaseBdev3", 00:18:55.173 "uuid": "e30c36e6-26c1-5688-b4fe-60f4168c41bc", 00:18:55.173 "is_configured": true, 00:18:55.173 "data_offset": 2048, 
00:18:55.173 "data_size": 63488 00:18:55.173 }, 00:18:55.173 { 00:18:55.173 "name": "BaseBdev4", 00:18:55.173 "uuid": "cbe118d3-4da1-5c61-98db-eadf7ad2a70c", 00:18:55.173 "is_configured": true, 00:18:55.173 "data_offset": 2048, 00:18:55.173 "data_size": 63488 00:18:55.173 } 00:18:55.173 ] 00:18:55.173 }' 00:18:55.173 13:27:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:55.173 13:27:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:55.743 13:27:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:18:55.743 13:27:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:18:55.743 [2024-07-25 13:27:36.399375] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x235d0e0 00:18:56.682 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.941 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:18:56.942 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.942 "name": "raid_bdev1", 00:18:56.942 "uuid": "d54e1606-0462-4fb4-ae82-e31bc9e6352f", 00:18:56.942 "strip_size_kb": 64, 00:18:56.942 "state": "online", 00:18:56.942 "raid_level": "raid0", 00:18:56.942 "superblock": true, 00:18:56.942 "num_base_bdevs": 4, 00:18:56.942 "num_base_bdevs_discovered": 4, 00:18:56.942 "num_base_bdevs_operational": 4, 00:18:56.942 "base_bdevs_list": [ 00:18:56.942 { 00:18:56.942 "name": "BaseBdev1", 00:18:56.942 "uuid": "b26cc146-e8ec-533e-8bf1-f7c86aecd481", 00:18:56.942 "is_configured": true, 00:18:56.942 "data_offset": 2048, 00:18:56.942 "data_size": 63488 00:18:56.942 }, 00:18:56.942 { 00:18:56.942 "name": "BaseBdev2", 00:18:56.942 "uuid": "0a894ef7-0498-5361-ab57-76fbb109347d", 00:18:56.942 "is_configured": true, 00:18:56.942 "data_offset": 2048, 00:18:56.942 "data_size": 63488 00:18:56.942 }, 00:18:56.942 { 00:18:56.942 "name": "BaseBdev3", 00:18:56.942 "uuid": "e30c36e6-26c1-5688-b4fe-60f4168c41bc", 00:18:56.942 "is_configured": 
true, 00:18:56.942 "data_offset": 2048, 00:18:56.942 "data_size": 63488 00:18:56.942 }, 00:18:56.942 { 00:18:56.942 "name": "BaseBdev4", 00:18:56.942 "uuid": "cbe118d3-4da1-5c61-98db-eadf7ad2a70c", 00:18:56.942 "is_configured": true, 00:18:56.942 "data_offset": 2048, 00:18:56.942 "data_size": 63488 00:18:56.942 } 00:18:56.942 ] 00:18:56.942 }' 00:18:56.942 13:27:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.942 13:27:37 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:57.510 13:27:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:18:57.770 [2024-07-25 13:27:38.404876] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:18:57.770 [2024-07-25 13:27:38.404905] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:57.770 [2024-07-25 13:27:38.407485] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:57.770 [2024-07-25 13:27:38.407514] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:57.770 [2024-07-25 13:27:38.407542] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:57.770 [2024-07-25 13:27:38.407551] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x235c800 name raid_bdev1, state offline 00:18:57.770 0 00:18:57.770 13:27:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 957349 00:18:57.770 13:27:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 957349 ']' 00:18:57.770 13:27:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 957349 00:18:57.770 13:27:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:18:57.770 13:27:38 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:18:57.770 13:27:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 957349 00:18:57.770 13:27:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:18:57.770 13:27:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:18:57.770 13:27:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 957349' 00:18:57.770 killing process with pid 957349 00:18:57.770 13:27:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 957349 00:18:57.770 [2024-07-25 13:27:38.485886] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:57.770 13:27:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 957349 00:18:57.770 [2024-07-25 13:27:38.502975] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:58.031 13:27:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:18:58.031 13:27:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.8JZ5tuLD6O 00:18:58.031 13:27:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:18:58.031 13:27:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.50 00:18:58.031 13:27:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:18:58.031 13:27:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:58.031 13:27:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:58.031 13:27:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.50 != \0\.\0\0 ]] 00:18:58.031 00:18:58.031 real 0m11.744s 00:18:58.031 user 0m20.347s 00:18:58.031 sys 0m1.475s 00:18:58.031 13:27:38 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@1126 -- # xtrace_disable 00:18:58.031 13:27:38 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.031 ************************************ 00:18:58.031 END TEST raid_read_error_test 00:18:58.031 ************************************ 00:18:58.031 13:27:38 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:18:58.031 13:27:38 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:18:58.031 13:27:38 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:18:58.031 13:27:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:58.031 ************************************ 00:18:58.031 START TEST raid_write_error_test 00:18:58.031 ************************************ 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid0 4 write 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid0 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:58.031 13:27:38 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:58.031 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid0 '!=' raid1 ']' 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.TWxY1u0hrz 
00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=959305 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 959305 /var/tmp/spdk-raid.sock 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 959305 ']' 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:58.032 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:18:58.032 13:27:38 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:18:58.032 [2024-07-25 13:27:38.780988] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:18:58.032 [2024-07-25 13:27:38.781045] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid959305 ] 00:18:58.292 [2024-07-25 13:27:38.873282] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:58.292 [2024-07-25 13:27:38.944467] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:18:58.292 [2024-07-25 13:27:38.989665] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:58.292 [2024-07-25 13:27:38.989692] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:58.860 13:27:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:18:58.860 13:27:39 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:18:58.860 13:27:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:58.860 13:27:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:18:59.120 BaseBdev1_malloc 00:18:59.120 13:27:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:18:59.380 true 00:18:59.380 13:27:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:18:59.380 [2024-07-25 13:27:40.152405] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:18:59.380 [2024-07-25 13:27:40.152447] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:18:59.380 [2024-07-25 13:27:40.152459] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x135a2a0 00:18:59.380 [2024-07-25 13:27:40.152466] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:18:59.380 [2024-07-25 13:27:40.153830] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:18:59.380 [2024-07-25 13:27:40.153850] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:18:59.380 BaseBdev1 00:18:59.640 13:27:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:18:59.640 13:27:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:18:59.640 BaseBdev2_malloc 00:18:59.640 13:27:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:18:59.901 true 00:18:59.901 13:27:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:00.161 [2024-07-25 13:27:40.735427] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:00.161 [2024-07-25 13:27:40.735454] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:00.161 [2024-07-25 13:27:40.735465] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1419420 00:19:00.161 [2024-07-25 13:27:40.735472] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:00.161 [2024-07-25 13:27:40.736669] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:00.161 [2024-07-25 13:27:40.736688] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:00.161 BaseBdev2 00:19:00.161 13:27:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:19:00.161 13:27:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:00.161 BaseBdev3_malloc 00:19:00.161 13:27:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:00.421 true 00:19:00.421 13:27:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:00.681 [2024-07-25 13:27:41.310597] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:00.681 [2024-07-25 13:27:41.310629] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:00.681 [2024-07-25 13:27:41.310647] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x141af70 00:19:00.681 [2024-07-25 13:27:41.310654] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:00.681 [2024-07-25 13:27:41.311873] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:00.681 [2024-07-25 13:27:41.311892] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:00.681 BaseBdev3 00:19:00.681 13:27:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:19:00.681 13:27:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:00.942 BaseBdev4_malloc 00:19:00.942 13:27:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:00.942 true 00:19:00.942 13:27:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:01.203 [2024-07-25 13:27:41.881611] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:01.203 [2024-07-25 13:27:41.881638] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:01.203 [2024-07-25 13:27:41.881647] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x141e1e0 00:19:01.203 [2024-07-25 13:27:41.881654] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:01.203 [2024-07-25 13:27:41.882846] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:01.203 [2024-07-25 13:27:41.882863] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:01.203 BaseBdev4 00:19:01.203 13:27:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:01.463 [2024-07-25 13:27:42.074126] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:01.463 [2024-07-25 13:27:42.075102] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:01.463 [2024-07-25 13:27:42.075156] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:01.463 [2024-07-25 13:27:42.075201] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:01.463 [2024-07-25 13:27:42.075364] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x141e800 00:19:01.464 [2024-07-25 13:27:42.075371] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:01.464 [2024-07-25 13:27:42.075515] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1419950 00:19:01.464 [2024-07-25 13:27:42.075635] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x141e800 00:19:01.464 [2024-07-25 13:27:42.075641] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x141e800 00:19:01.464 [2024-07-25 13:27:42.075724] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:01.464 13:27:42 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:01.464 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:01.723 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.723 "name": "raid_bdev1", 00:19:01.723 "uuid": "eeb622d4-5d6d-4042-9c6c-bb63937671ec", 00:19:01.723 "strip_size_kb": 64, 00:19:01.723 "state": "online", 00:19:01.723 "raid_level": "raid0", 00:19:01.723 "superblock": true, 00:19:01.724 "num_base_bdevs": 4, 00:19:01.724 "num_base_bdevs_discovered": 4, 00:19:01.724 "num_base_bdevs_operational": 4, 00:19:01.724 "base_bdevs_list": [ 00:19:01.724 { 00:19:01.724 "name": "BaseBdev1", 00:19:01.724 "uuid": "d07568c2-158a-5548-8bae-816f6a5f5da2", 00:19:01.724 "is_configured": true, 00:19:01.724 "data_offset": 2048, 00:19:01.724 "data_size": 63488 00:19:01.724 }, 00:19:01.724 { 00:19:01.724 "name": "BaseBdev2", 00:19:01.724 "uuid": "bb25bb68-41ad-5226-86f9-5d84623e68c7", 00:19:01.724 "is_configured": true, 00:19:01.724 "data_offset": 2048, 00:19:01.724 "data_size": 63488 00:19:01.724 }, 00:19:01.724 { 00:19:01.724 "name": "BaseBdev3", 00:19:01.724 "uuid": "8f00449a-67c8-5ac1-893a-32e18fa1b428", 00:19:01.724 "is_configured": true, 00:19:01.724 "data_offset": 2048, 00:19:01.724 "data_size": 63488 00:19:01.724 }, 00:19:01.724 { 00:19:01.724 "name": "BaseBdev4", 00:19:01.724 "uuid": "f7ddb082-e5e9-549c-be5b-781a973d277a", 00:19:01.724 "is_configured": true, 00:19:01.724 "data_offset": 2048, 00:19:01.724 "data_size": 63488 00:19:01.724 } 00:19:01.724 ] 00:19:01.724 }' 00:19:01.724 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.724 13:27:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:02.295 13:27:42 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@840 -- # sleep 1 00:19:02.295 13:27:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:02.295 [2024-07-25 13:27:42.928525] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x141f0e0 00:19:03.236 13:27:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:03.495 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:19:03.495 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid0 = \r\a\i\d\1 ]] 00:19:03.495 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:19:03.495 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:19:03.495 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:03.496 13:27:44 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:03.496 "name": "raid_bdev1", 00:19:03.496 "uuid": "eeb622d4-5d6d-4042-9c6c-bb63937671ec", 00:19:03.496 "strip_size_kb": 64, 00:19:03.496 "state": "online", 00:19:03.496 "raid_level": "raid0", 00:19:03.496 "superblock": true, 00:19:03.496 "num_base_bdevs": 4, 00:19:03.496 "num_base_bdevs_discovered": 4, 00:19:03.496 "num_base_bdevs_operational": 4, 00:19:03.496 "base_bdevs_list": [ 00:19:03.496 { 00:19:03.496 "name": "BaseBdev1", 00:19:03.496 "uuid": "d07568c2-158a-5548-8bae-816f6a5f5da2", 00:19:03.496 "is_configured": true, 00:19:03.496 "data_offset": 2048, 00:19:03.496 "data_size": 63488 00:19:03.496 }, 00:19:03.496 { 00:19:03.496 "name": "BaseBdev2", 00:19:03.496 "uuid": "bb25bb68-41ad-5226-86f9-5d84623e68c7", 00:19:03.496 "is_configured": true, 00:19:03.496 "data_offset": 2048, 00:19:03.496 "data_size": 63488 00:19:03.496 }, 00:19:03.496 { 00:19:03.496 "name": "BaseBdev3", 00:19:03.496 "uuid": "8f00449a-67c8-5ac1-893a-32e18fa1b428", 00:19:03.496 "is_configured": true, 00:19:03.496 "data_offset": 2048, 00:19:03.496 "data_size": 63488 00:19:03.496 }, 00:19:03.496 { 00:19:03.496 "name": "BaseBdev4", 00:19:03.496 "uuid": "f7ddb082-e5e9-549c-be5b-781a973d277a", 00:19:03.496 "is_configured": true, 00:19:03.496 "data_offset": 2048, 00:19:03.496 "data_size": 63488 00:19:03.496 } 00:19:03.496 ] 00:19:03.496 }' 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:03.496 13:27:44 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:19:04.066 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:04.326 [2024-07-25 13:27:44.951468] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:04.326 [2024-07-25 13:27:44.951498] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:04.326 [2024-07-25 13:27:44.954190] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:04.326 [2024-07-25 13:27:44.954219] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:04.326 [2024-07-25 13:27:44.954247] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:04.326 [2024-07-25 13:27:44.954252] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x141e800 name raid_bdev1, state offline 00:19:04.326 0 00:19:04.326 13:27:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 959305 00:19:04.326 13:27:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 959305 ']' 00:19:04.326 13:27:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 959305 00:19:04.326 13:27:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:19:04.326 13:27:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:04.326 13:27:44 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 959305 00:19:04.326 13:27:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:04.326 13:27:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:04.326 13:27:45 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@968 -- # echo 'killing process with pid 959305' 00:19:04.326 killing process with pid 959305 00:19:04.326 13:27:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 959305 00:19:04.326 [2024-07-25 13:27:45.020885] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:04.326 13:27:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 959305 00:19:04.326 [2024-07-25 13:27:45.038143] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:04.588 13:27:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.TWxY1u0hrz 00:19:04.588 13:27:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:19:04.588 13:27:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:19:04.588 13:27:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.50 00:19:04.588 13:27:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid0 00:19:04.588 13:27:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:04.588 13:27:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:04.588 13:27:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.50 != \0\.\0\0 ]] 00:19:04.588 00:19:04.588 real 0m6.464s 00:19:04.588 user 0m10.435s 00:19:04.588 sys 0m0.889s 00:19:04.588 13:27:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:04.588 13:27:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.588 ************************************ 00:19:04.588 END TEST raid_write_error_test 00:19:04.588 ************************************ 00:19:04.588 13:27:45 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:19:04.588 13:27:45 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test 
raid_state_function_test concat 4 false 00:19:04.589 13:27:45 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:19:04.589 13:27:45 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:04.589 13:27:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:04.589 ************************************ 00:19:04.589 START TEST raid_state_function_test 00:19:04.589 ************************************ 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 4 false 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:04.589 13:27:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=960601 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 960601' 00:19:04.589 Process 
raid pid: 960601 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 960601 /var/tmp/spdk-raid.sock 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 960601 ']' 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:04.589 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:04.589 13:27:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:04.589 [2024-07-25 13:27:45.314090] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:19:04.589 [2024-07-25 13:27:45.314137] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:04.850 [2024-07-25 13:27:45.401871] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:04.850 [2024-07-25 13:27:45.465532] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:04.850 [2024-07-25 13:27:45.505416] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:04.850 [2024-07-25 13:27:45.505438] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:05.419 13:27:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:05.419 13:27:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:19:05.419 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:05.679 [2024-07-25 13:27:46.324525] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:05.679 [2024-07-25 13:27:46.324560] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:05.679 [2024-07-25 13:27:46.324567] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:05.679 [2024-07-25 13:27:46.324573] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:05.679 [2024-07-25 13:27:46.324577] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:05.679 [2024-07-25 13:27:46.324583] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:19:05.679 [2024-07-25 13:27:46.324588] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:05.679 [2024-07-25 13:27:46.324593] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:05.679 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:05.939 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:05.939 "name": "Existed_Raid", 00:19:05.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.939 "strip_size_kb": 64, 
00:19:05.939 "state": "configuring", 00:19:05.939 "raid_level": "concat", 00:19:05.939 "superblock": false, 00:19:05.939 "num_base_bdevs": 4, 00:19:05.939 "num_base_bdevs_discovered": 0, 00:19:05.939 "num_base_bdevs_operational": 4, 00:19:05.939 "base_bdevs_list": [ 00:19:05.939 { 00:19:05.939 "name": "BaseBdev1", 00:19:05.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.939 "is_configured": false, 00:19:05.939 "data_offset": 0, 00:19:05.939 "data_size": 0 00:19:05.939 }, 00:19:05.939 { 00:19:05.939 "name": "BaseBdev2", 00:19:05.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.939 "is_configured": false, 00:19:05.939 "data_offset": 0, 00:19:05.939 "data_size": 0 00:19:05.939 }, 00:19:05.939 { 00:19:05.939 "name": "BaseBdev3", 00:19:05.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.939 "is_configured": false, 00:19:05.939 "data_offset": 0, 00:19:05.939 "data_size": 0 00:19:05.939 }, 00:19:05.939 { 00:19:05.939 "name": "BaseBdev4", 00:19:05.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:05.939 "is_configured": false, 00:19:05.939 "data_offset": 0, 00:19:05.939 "data_size": 0 00:19:05.939 } 00:19:05.939 ] 00:19:05.939 }' 00:19:05.939 13:27:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:05.939 13:27:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:06.509 13:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:06.509 [2024-07-25 13:27:47.290861] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:06.509 [2024-07-25 13:27:47.290886] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12db6f0 name Existed_Raid, state configuring 00:19:06.769 13:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:06.769 [2024-07-25 13:27:47.463305] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:06.769 [2024-07-25 13:27:47.463324] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:06.769 [2024-07-25 13:27:47.463329] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:06.769 [2024-07-25 13:27:47.463336] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:06.769 [2024-07-25 13:27:47.463340] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:06.769 [2024-07-25 13:27:47.463345] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:06.769 [2024-07-25 13:27:47.463350] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:06.769 [2024-07-25 13:27:47.463355] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:06.769 13:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:07.029 [2024-07-25 13:27:47.642473] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:07.029 BaseBdev1 00:19:07.029 13:27:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:07.029 13:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:19:07.029 13:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:07.029 13:27:47 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@901 -- # local i 00:19:07.029 13:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:07.029 13:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:07.029 13:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:07.288 13:27:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:07.288 [ 00:19:07.288 { 00:19:07.288 "name": "BaseBdev1", 00:19:07.288 "aliases": [ 00:19:07.288 "358ac6a6-a1df-46d2-a621-88dd19709948" 00:19:07.288 ], 00:19:07.288 "product_name": "Malloc disk", 00:19:07.288 "block_size": 512, 00:19:07.288 "num_blocks": 65536, 00:19:07.288 "uuid": "358ac6a6-a1df-46d2-a621-88dd19709948", 00:19:07.288 "assigned_rate_limits": { 00:19:07.288 "rw_ios_per_sec": 0, 00:19:07.288 "rw_mbytes_per_sec": 0, 00:19:07.288 "r_mbytes_per_sec": 0, 00:19:07.288 "w_mbytes_per_sec": 0 00:19:07.288 }, 00:19:07.288 "claimed": true, 00:19:07.288 "claim_type": "exclusive_write", 00:19:07.288 "zoned": false, 00:19:07.288 "supported_io_types": { 00:19:07.288 "read": true, 00:19:07.288 "write": true, 00:19:07.288 "unmap": true, 00:19:07.288 "flush": true, 00:19:07.288 "reset": true, 00:19:07.288 "nvme_admin": false, 00:19:07.288 "nvme_io": false, 00:19:07.288 "nvme_io_md": false, 00:19:07.288 "write_zeroes": true, 00:19:07.288 "zcopy": true, 00:19:07.288 "get_zone_info": false, 00:19:07.288 "zone_management": false, 00:19:07.288 "zone_append": false, 00:19:07.288 "compare": false, 00:19:07.288 "compare_and_write": false, 00:19:07.288 "abort": true, 00:19:07.288 "seek_hole": false, 00:19:07.288 "seek_data": false, 00:19:07.288 "copy": true, 00:19:07.288 "nvme_iov_md": 
false 00:19:07.288 }, 00:19:07.288 "memory_domains": [ 00:19:07.288 { 00:19:07.288 "dma_device_id": "system", 00:19:07.288 "dma_device_type": 1 00:19:07.288 }, 00:19:07.288 { 00:19:07.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:07.288 "dma_device_type": 2 00:19:07.288 } 00:19:07.288 ], 00:19:07.288 "driver_specific": {} 00:19:07.288 } 00:19:07.288 ] 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:07.288 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:07.548 13:27:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:07.548 "name": "Existed_Raid", 00:19:07.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.548 "strip_size_kb": 64, 00:19:07.548 "state": "configuring", 00:19:07.548 "raid_level": "concat", 00:19:07.548 "superblock": false, 00:19:07.548 "num_base_bdevs": 4, 00:19:07.548 "num_base_bdevs_discovered": 1, 00:19:07.548 "num_base_bdevs_operational": 4, 00:19:07.548 "base_bdevs_list": [ 00:19:07.548 { 00:19:07.548 "name": "BaseBdev1", 00:19:07.548 "uuid": "358ac6a6-a1df-46d2-a621-88dd19709948", 00:19:07.548 "is_configured": true, 00:19:07.548 "data_offset": 0, 00:19:07.548 "data_size": 65536 00:19:07.548 }, 00:19:07.548 { 00:19:07.548 "name": "BaseBdev2", 00:19:07.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.548 "is_configured": false, 00:19:07.548 "data_offset": 0, 00:19:07.548 "data_size": 0 00:19:07.548 }, 00:19:07.548 { 00:19:07.548 "name": "BaseBdev3", 00:19:07.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.548 "is_configured": false, 00:19:07.548 "data_offset": 0, 00:19:07.548 "data_size": 0 00:19:07.548 }, 00:19:07.548 { 00:19:07.548 "name": "BaseBdev4", 00:19:07.548 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:07.548 "is_configured": false, 00:19:07.548 "data_offset": 0, 00:19:07.548 "data_size": 0 00:19:07.548 } 00:19:07.548 ] 00:19:07.548 }' 00:19:07.548 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:07.548 13:27:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:08.117 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:08.377 [2024-07-25 13:27:48.953784] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:08.377 [2024-07-25 13:27:48.953814] bdev_raid.c: 
378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12daf60 name Existed_Raid, state configuring 00:19:08.377 13:27:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:08.377 [2024-07-25 13:27:49.142296] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:08.377 [2024-07-25 13:27:49.143590] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:08.377 [2024-07-25 13:27:49.143615] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:08.377 [2024-07-25 13:27:49.143621] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:08.377 [2024-07-25 13:27:49.143627] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:08.377 [2024-07-25 13:27:49.143632] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:08.377 [2024-07-25 13:27:49.143637] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.377 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:08.637 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.637 "name": "Existed_Raid", 00:19:08.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.637 "strip_size_kb": 64, 00:19:08.637 "state": "configuring", 00:19:08.637 "raid_level": "concat", 00:19:08.637 "superblock": false, 00:19:08.637 "num_base_bdevs": 4, 00:19:08.637 "num_base_bdevs_discovered": 1, 00:19:08.637 "num_base_bdevs_operational": 4, 00:19:08.637 "base_bdevs_list": [ 00:19:08.637 { 00:19:08.637 "name": "BaseBdev1", 00:19:08.637 "uuid": "358ac6a6-a1df-46d2-a621-88dd19709948", 00:19:08.637 "is_configured": true, 00:19:08.637 "data_offset": 0, 00:19:08.637 "data_size": 65536 00:19:08.637 }, 00:19:08.637 { 00:19:08.637 "name": "BaseBdev2", 00:19:08.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.637 "is_configured": false, 00:19:08.637 "data_offset": 0, 00:19:08.637 "data_size": 0 00:19:08.637 }, 00:19:08.637 { 00:19:08.637 "name": "BaseBdev3", 
00:19:08.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.637 "is_configured": false, 00:19:08.637 "data_offset": 0, 00:19:08.637 "data_size": 0 00:19:08.637 }, 00:19:08.637 { 00:19:08.637 "name": "BaseBdev4", 00:19:08.637 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:08.637 "is_configured": false, 00:19:08.637 "data_offset": 0, 00:19:08.637 "data_size": 0 00:19:08.637 } 00:19:08.637 ] 00:19:08.637 }' 00:19:08.637 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.637 13:27:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:09.206 13:27:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:09.466 [2024-07-25 13:27:50.085803] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:09.466 BaseBdev2 00:19:09.466 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:09.466 13:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:19:09.466 13:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:09.466 13:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:09.466 13:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:09.466 13:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:09.466 13:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:09.726 [ 00:19:09.726 { 00:19:09.726 "name": "BaseBdev2", 00:19:09.726 "aliases": [ 00:19:09.726 "24cb19b9-1a2c-4a1d-85b7-1131ec969cc9" 00:19:09.726 ], 00:19:09.726 "product_name": "Malloc disk", 00:19:09.726 "block_size": 512, 00:19:09.726 "num_blocks": 65536, 00:19:09.726 "uuid": "24cb19b9-1a2c-4a1d-85b7-1131ec969cc9", 00:19:09.726 "assigned_rate_limits": { 00:19:09.726 "rw_ios_per_sec": 0, 00:19:09.726 "rw_mbytes_per_sec": 0, 00:19:09.726 "r_mbytes_per_sec": 0, 00:19:09.726 "w_mbytes_per_sec": 0 00:19:09.726 }, 00:19:09.726 "claimed": true, 00:19:09.726 "claim_type": "exclusive_write", 00:19:09.726 "zoned": false, 00:19:09.726 "supported_io_types": { 00:19:09.726 "read": true, 00:19:09.726 "write": true, 00:19:09.726 "unmap": true, 00:19:09.726 "flush": true, 00:19:09.726 "reset": true, 00:19:09.726 "nvme_admin": false, 00:19:09.726 "nvme_io": false, 00:19:09.726 "nvme_io_md": false, 00:19:09.726 "write_zeroes": true, 00:19:09.726 "zcopy": true, 00:19:09.726 "get_zone_info": false, 00:19:09.726 "zone_management": false, 00:19:09.726 "zone_append": false, 00:19:09.726 "compare": false, 00:19:09.726 "compare_and_write": false, 00:19:09.726 "abort": true, 00:19:09.726 "seek_hole": false, 00:19:09.726 "seek_data": false, 00:19:09.726 "copy": true, 00:19:09.726 "nvme_iov_md": false 00:19:09.726 }, 00:19:09.726 "memory_domains": [ 00:19:09.726 { 00:19:09.726 "dma_device_id": "system", 00:19:09.726 "dma_device_type": 1 00:19:09.726 }, 00:19:09.726 { 00:19:09.726 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.726 "dma_device_type": 2 00:19:09.726 } 00:19:09.726 ], 00:19:09.726 "driver_specific": {} 00:19:09.726 } 00:19:09.726 ] 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 
00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:09.726 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:09.986 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:09.986 "name": "Existed_Raid", 00:19:09.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.986 "strip_size_kb": 64, 00:19:09.986 "state": "configuring", 00:19:09.986 "raid_level": "concat", 00:19:09.986 "superblock": false, 00:19:09.986 "num_base_bdevs": 4, 00:19:09.986 
"num_base_bdevs_discovered": 2, 00:19:09.986 "num_base_bdevs_operational": 4, 00:19:09.986 "base_bdevs_list": [ 00:19:09.986 { 00:19:09.986 "name": "BaseBdev1", 00:19:09.986 "uuid": "358ac6a6-a1df-46d2-a621-88dd19709948", 00:19:09.986 "is_configured": true, 00:19:09.986 "data_offset": 0, 00:19:09.986 "data_size": 65536 00:19:09.986 }, 00:19:09.986 { 00:19:09.986 "name": "BaseBdev2", 00:19:09.986 "uuid": "24cb19b9-1a2c-4a1d-85b7-1131ec969cc9", 00:19:09.986 "is_configured": true, 00:19:09.986 "data_offset": 0, 00:19:09.986 "data_size": 65536 00:19:09.986 }, 00:19:09.986 { 00:19:09.986 "name": "BaseBdev3", 00:19:09.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.986 "is_configured": false, 00:19:09.986 "data_offset": 0, 00:19:09.986 "data_size": 0 00:19:09.986 }, 00:19:09.986 { 00:19:09.986 "name": "BaseBdev4", 00:19:09.986 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:09.986 "is_configured": false, 00:19:09.986 "data_offset": 0, 00:19:09.986 "data_size": 0 00:19:09.986 } 00:19:09.986 ] 00:19:09.986 }' 00:19:09.986 13:27:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:09.986 13:27:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:10.556 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:10.817 [2024-07-25 13:27:51.418238] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:10.817 BaseBdev3 00:19:10.817 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:10.817 13:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:19:10.817 13:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:10.817 13:27:51 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:10.817 13:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:10.817 13:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:10.817 13:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:11.078 [ 00:19:11.078 { 00:19:11.078 "name": "BaseBdev3", 00:19:11.078 "aliases": [ 00:19:11.078 "57dd07ea-89fb-42f1-b678-accb711c32d3" 00:19:11.078 ], 00:19:11.078 "product_name": "Malloc disk", 00:19:11.078 "block_size": 512, 00:19:11.078 "num_blocks": 65536, 00:19:11.078 "uuid": "57dd07ea-89fb-42f1-b678-accb711c32d3", 00:19:11.078 "assigned_rate_limits": { 00:19:11.078 "rw_ios_per_sec": 0, 00:19:11.078 "rw_mbytes_per_sec": 0, 00:19:11.078 "r_mbytes_per_sec": 0, 00:19:11.078 "w_mbytes_per_sec": 0 00:19:11.078 }, 00:19:11.078 "claimed": true, 00:19:11.078 "claim_type": "exclusive_write", 00:19:11.078 "zoned": false, 00:19:11.078 "supported_io_types": { 00:19:11.078 "read": true, 00:19:11.078 "write": true, 00:19:11.078 "unmap": true, 00:19:11.078 "flush": true, 00:19:11.078 "reset": true, 00:19:11.078 "nvme_admin": false, 00:19:11.078 "nvme_io": false, 00:19:11.078 "nvme_io_md": false, 00:19:11.078 "write_zeroes": true, 00:19:11.078 "zcopy": true, 00:19:11.078 "get_zone_info": false, 00:19:11.078 "zone_management": false, 00:19:11.078 "zone_append": false, 00:19:11.078 "compare": false, 00:19:11.078 "compare_and_write": false, 00:19:11.078 "abort": true, 00:19:11.078 "seek_hole": false, 00:19:11.078 "seek_data": false, 00:19:11.078 "copy": 
true, 00:19:11.078 "nvme_iov_md": false 00:19:11.078 }, 00:19:11.078 "memory_domains": [ 00:19:11.078 { 00:19:11.078 "dma_device_id": "system", 00:19:11.078 "dma_device_type": 1 00:19:11.078 }, 00:19:11.078 { 00:19:11.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.078 "dma_device_type": 2 00:19:11.078 } 00:19:11.078 ], 00:19:11.078 "driver_specific": {} 00:19:11.078 } 00:19:11.078 ] 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:11.078 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:11.079 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:11.079 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:11.338 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:11.338 "name": "Existed_Raid", 00:19:11.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.338 "strip_size_kb": 64, 00:19:11.338 "state": "configuring", 00:19:11.338 "raid_level": "concat", 00:19:11.338 "superblock": false, 00:19:11.338 "num_base_bdevs": 4, 00:19:11.338 "num_base_bdevs_discovered": 3, 00:19:11.338 "num_base_bdevs_operational": 4, 00:19:11.338 "base_bdevs_list": [ 00:19:11.338 { 00:19:11.338 "name": "BaseBdev1", 00:19:11.338 "uuid": "358ac6a6-a1df-46d2-a621-88dd19709948", 00:19:11.338 "is_configured": true, 00:19:11.338 "data_offset": 0, 00:19:11.338 "data_size": 65536 00:19:11.338 }, 00:19:11.338 { 00:19:11.338 "name": "BaseBdev2", 00:19:11.338 "uuid": "24cb19b9-1a2c-4a1d-85b7-1131ec969cc9", 00:19:11.338 "is_configured": true, 00:19:11.338 "data_offset": 0, 00:19:11.338 "data_size": 65536 00:19:11.338 }, 00:19:11.338 { 00:19:11.338 "name": "BaseBdev3", 00:19:11.338 "uuid": "57dd07ea-89fb-42f1-b678-accb711c32d3", 00:19:11.338 "is_configured": true, 00:19:11.338 "data_offset": 0, 00:19:11.338 "data_size": 65536 00:19:11.338 }, 00:19:11.338 { 00:19:11.338 "name": "BaseBdev4", 00:19:11.338 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:11.338 "is_configured": false, 00:19:11.338 "data_offset": 0, 00:19:11.338 "data_size": 0 00:19:11.338 } 00:19:11.338 ] 00:19:11.338 }' 00:19:11.338 13:27:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:11.338 13:27:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:11.907 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:11.907 [2024-07-25 13:27:52.654438] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:11.907 [2024-07-25 13:27:52.654464] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x12dbfd0 00:19:11.907 [2024-07-25 13:27:52.654469] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:11.907 [2024-07-25 13:27:52.654634] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14808e0 00:19:11.907 [2024-07-25 13:27:52.654733] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12dbfd0 00:19:11.907 [2024-07-25 13:27:52.654738] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12dbfd0 00:19:11.907 [2024-07-25 13:27:52.654864] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:11.907 BaseBdev4 00:19:11.907 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:11.907 13:27:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:19:11.907 13:27:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:11.907 13:27:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:11.907 13:27:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:11.907 13:27:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:11.907 13:27:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:12.167 13:27:52 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:12.427 [ 00:19:12.427 { 00:19:12.427 "name": "BaseBdev4", 00:19:12.427 "aliases": [ 00:19:12.427 "dcfbaf5f-e52a-4437-9c04-4d2fce89e746" 00:19:12.427 ], 00:19:12.427 "product_name": "Malloc disk", 00:19:12.427 "block_size": 512, 00:19:12.427 "num_blocks": 65536, 00:19:12.427 "uuid": "dcfbaf5f-e52a-4437-9c04-4d2fce89e746", 00:19:12.427 "assigned_rate_limits": { 00:19:12.427 "rw_ios_per_sec": 0, 00:19:12.427 "rw_mbytes_per_sec": 0, 00:19:12.427 "r_mbytes_per_sec": 0, 00:19:12.427 "w_mbytes_per_sec": 0 00:19:12.427 }, 00:19:12.427 "claimed": true, 00:19:12.427 "claim_type": "exclusive_write", 00:19:12.427 "zoned": false, 00:19:12.427 "supported_io_types": { 00:19:12.427 "read": true, 00:19:12.427 "write": true, 00:19:12.427 "unmap": true, 00:19:12.427 "flush": true, 00:19:12.427 "reset": true, 00:19:12.427 "nvme_admin": false, 00:19:12.427 "nvme_io": false, 00:19:12.427 "nvme_io_md": false, 00:19:12.427 "write_zeroes": true, 00:19:12.427 "zcopy": true, 00:19:12.427 "get_zone_info": false, 00:19:12.427 "zone_management": false, 00:19:12.427 "zone_append": false, 00:19:12.427 "compare": false, 00:19:12.427 "compare_and_write": false, 00:19:12.427 "abort": true, 00:19:12.427 "seek_hole": false, 00:19:12.427 "seek_data": false, 00:19:12.427 "copy": true, 00:19:12.427 "nvme_iov_md": false 00:19:12.427 }, 00:19:12.427 "memory_domains": [ 00:19:12.427 { 00:19:12.427 "dma_device_id": "system", 00:19:12.427 "dma_device_type": 1 00:19:12.427 }, 00:19:12.427 { 00:19:12.427 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:12.427 "dma_device_type": 2 00:19:12.427 } 00:19:12.427 ], 00:19:12.427 "driver_specific": {} 00:19:12.427 } 00:19:12.427 ] 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.427 13:27:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:12.427 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:12.427 "name": "Existed_Raid", 00:19:12.427 "uuid": "86f70516-6457-45c9-b53d-ac012b28c689", 00:19:12.427 "strip_size_kb": 64, 00:19:12.427 "state": "online", 00:19:12.427 "raid_level": "concat", 00:19:12.427 "superblock": false, 00:19:12.427 
"num_base_bdevs": 4, 00:19:12.427 "num_base_bdevs_discovered": 4, 00:19:12.427 "num_base_bdevs_operational": 4, 00:19:12.427 "base_bdevs_list": [ 00:19:12.427 { 00:19:12.427 "name": "BaseBdev1", 00:19:12.427 "uuid": "358ac6a6-a1df-46d2-a621-88dd19709948", 00:19:12.427 "is_configured": true, 00:19:12.427 "data_offset": 0, 00:19:12.427 "data_size": 65536 00:19:12.427 }, 00:19:12.427 { 00:19:12.427 "name": "BaseBdev2", 00:19:12.427 "uuid": "24cb19b9-1a2c-4a1d-85b7-1131ec969cc9", 00:19:12.427 "is_configured": true, 00:19:12.427 "data_offset": 0, 00:19:12.427 "data_size": 65536 00:19:12.427 }, 00:19:12.427 { 00:19:12.427 "name": "BaseBdev3", 00:19:12.427 "uuid": "57dd07ea-89fb-42f1-b678-accb711c32d3", 00:19:12.427 "is_configured": true, 00:19:12.427 "data_offset": 0, 00:19:12.427 "data_size": 65536 00:19:12.427 }, 00:19:12.427 { 00:19:12.427 "name": "BaseBdev4", 00:19:12.427 "uuid": "dcfbaf5f-e52a-4437-9c04-4d2fce89e746", 00:19:12.427 "is_configured": true, 00:19:12.427 "data_offset": 0, 00:19:12.427 "data_size": 65536 00:19:12.427 } 00:19:12.427 ] 00:19:12.427 }' 00:19:12.427 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:12.427 13:27:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:13.003 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:13.003 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:13.003 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:13.003 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:13.003 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:13.003 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:13.003 13:27:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:13.003 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:13.319 [2024-07-25 13:27:53.909900] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:13.319 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:13.319 "name": "Existed_Raid", 00:19:13.319 "aliases": [ 00:19:13.319 "86f70516-6457-45c9-b53d-ac012b28c689" 00:19:13.319 ], 00:19:13.319 "product_name": "Raid Volume", 00:19:13.319 "block_size": 512, 00:19:13.319 "num_blocks": 262144, 00:19:13.319 "uuid": "86f70516-6457-45c9-b53d-ac012b28c689", 00:19:13.319 "assigned_rate_limits": { 00:19:13.319 "rw_ios_per_sec": 0, 00:19:13.319 "rw_mbytes_per_sec": 0, 00:19:13.319 "r_mbytes_per_sec": 0, 00:19:13.319 "w_mbytes_per_sec": 0 00:19:13.319 }, 00:19:13.319 "claimed": false, 00:19:13.319 "zoned": false, 00:19:13.319 "supported_io_types": { 00:19:13.319 "read": true, 00:19:13.319 "write": true, 00:19:13.319 "unmap": true, 00:19:13.319 "flush": true, 00:19:13.319 "reset": true, 00:19:13.319 "nvme_admin": false, 00:19:13.319 "nvme_io": false, 00:19:13.319 "nvme_io_md": false, 00:19:13.319 "write_zeroes": true, 00:19:13.319 "zcopy": false, 00:19:13.319 "get_zone_info": false, 00:19:13.319 "zone_management": false, 00:19:13.319 "zone_append": false, 00:19:13.319 "compare": false, 00:19:13.319 "compare_and_write": false, 00:19:13.319 "abort": false, 00:19:13.319 "seek_hole": false, 00:19:13.319 "seek_data": false, 00:19:13.319 "copy": false, 00:19:13.319 "nvme_iov_md": false 00:19:13.319 }, 00:19:13.319 "memory_domains": [ 00:19:13.319 { 00:19:13.319 "dma_device_id": "system", 00:19:13.319 "dma_device_type": 1 00:19:13.319 }, 00:19:13.319 { 00:19:13.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.319 
"dma_device_type": 2 00:19:13.319 }, 00:19:13.319 { 00:19:13.319 "dma_device_id": "system", 00:19:13.319 "dma_device_type": 1 00:19:13.319 }, 00:19:13.319 { 00:19:13.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.319 "dma_device_type": 2 00:19:13.319 }, 00:19:13.319 { 00:19:13.319 "dma_device_id": "system", 00:19:13.319 "dma_device_type": 1 00:19:13.319 }, 00:19:13.319 { 00:19:13.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.319 "dma_device_type": 2 00:19:13.319 }, 00:19:13.319 { 00:19:13.319 "dma_device_id": "system", 00:19:13.319 "dma_device_type": 1 00:19:13.319 }, 00:19:13.319 { 00:19:13.319 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.319 "dma_device_type": 2 00:19:13.319 } 00:19:13.319 ], 00:19:13.319 "driver_specific": { 00:19:13.319 "raid": { 00:19:13.319 "uuid": "86f70516-6457-45c9-b53d-ac012b28c689", 00:19:13.319 "strip_size_kb": 64, 00:19:13.319 "state": "online", 00:19:13.319 "raid_level": "concat", 00:19:13.319 "superblock": false, 00:19:13.319 "num_base_bdevs": 4, 00:19:13.319 "num_base_bdevs_discovered": 4, 00:19:13.319 "num_base_bdevs_operational": 4, 00:19:13.319 "base_bdevs_list": [ 00:19:13.319 { 00:19:13.319 "name": "BaseBdev1", 00:19:13.319 "uuid": "358ac6a6-a1df-46d2-a621-88dd19709948", 00:19:13.319 "is_configured": true, 00:19:13.319 "data_offset": 0, 00:19:13.319 "data_size": 65536 00:19:13.319 }, 00:19:13.319 { 00:19:13.319 "name": "BaseBdev2", 00:19:13.319 "uuid": "24cb19b9-1a2c-4a1d-85b7-1131ec969cc9", 00:19:13.319 "is_configured": true, 00:19:13.319 "data_offset": 0, 00:19:13.319 "data_size": 65536 00:19:13.319 }, 00:19:13.319 { 00:19:13.319 "name": "BaseBdev3", 00:19:13.319 "uuid": "57dd07ea-89fb-42f1-b678-accb711c32d3", 00:19:13.319 "is_configured": true, 00:19:13.319 "data_offset": 0, 00:19:13.319 "data_size": 65536 00:19:13.319 }, 00:19:13.319 { 00:19:13.319 "name": "BaseBdev4", 00:19:13.319 "uuid": "dcfbaf5f-e52a-4437-9c04-4d2fce89e746", 00:19:13.320 "is_configured": true, 00:19:13.320 "data_offset": 0, 
00:19:13.320 "data_size": 65536 00:19:13.320 } 00:19:13.320 ] 00:19:13.320 } 00:19:13.320 } 00:19:13.320 }' 00:19:13.320 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:13.320 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:13.320 BaseBdev2 00:19:13.320 BaseBdev3 00:19:13.320 BaseBdev4' 00:19:13.320 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:13.320 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:13.320 13:27:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:13.579 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:13.579 "name": "BaseBdev1", 00:19:13.579 "aliases": [ 00:19:13.579 "358ac6a6-a1df-46d2-a621-88dd19709948" 00:19:13.579 ], 00:19:13.579 "product_name": "Malloc disk", 00:19:13.579 "block_size": 512, 00:19:13.579 "num_blocks": 65536, 00:19:13.579 "uuid": "358ac6a6-a1df-46d2-a621-88dd19709948", 00:19:13.579 "assigned_rate_limits": { 00:19:13.579 "rw_ios_per_sec": 0, 00:19:13.579 "rw_mbytes_per_sec": 0, 00:19:13.579 "r_mbytes_per_sec": 0, 00:19:13.579 "w_mbytes_per_sec": 0 00:19:13.579 }, 00:19:13.579 "claimed": true, 00:19:13.579 "claim_type": "exclusive_write", 00:19:13.579 "zoned": false, 00:19:13.579 "supported_io_types": { 00:19:13.579 "read": true, 00:19:13.579 "write": true, 00:19:13.579 "unmap": true, 00:19:13.579 "flush": true, 00:19:13.579 "reset": true, 00:19:13.579 "nvme_admin": false, 00:19:13.579 "nvme_io": false, 00:19:13.579 "nvme_io_md": false, 00:19:13.579 "write_zeroes": true, 00:19:13.579 "zcopy": true, 00:19:13.579 "get_zone_info": false, 00:19:13.579 "zone_management": 
false, 00:19:13.579 "zone_append": false, 00:19:13.579 "compare": false, 00:19:13.579 "compare_and_write": false, 00:19:13.579 "abort": true, 00:19:13.579 "seek_hole": false, 00:19:13.579 "seek_data": false, 00:19:13.579 "copy": true, 00:19:13.579 "nvme_iov_md": false 00:19:13.579 }, 00:19:13.579 "memory_domains": [ 00:19:13.579 { 00:19:13.579 "dma_device_id": "system", 00:19:13.579 "dma_device_type": 1 00:19:13.579 }, 00:19:13.579 { 00:19:13.579 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:13.579 "dma_device_type": 2 00:19:13.579 } 00:19:13.579 ], 00:19:13.579 "driver_specific": {} 00:19:13.579 }' 00:19:13.579 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.579 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:13.579 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:13.579 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.579 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:13.579 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:13.579 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.839 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:13.839 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:13.839 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.839 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:13.839 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:13.839 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:13.839 13:27:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:13.839 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:14.099 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:14.099 "name": "BaseBdev2", 00:19:14.099 "aliases": [ 00:19:14.099 "24cb19b9-1a2c-4a1d-85b7-1131ec969cc9" 00:19:14.099 ], 00:19:14.099 "product_name": "Malloc disk", 00:19:14.099 "block_size": 512, 00:19:14.099 "num_blocks": 65536, 00:19:14.099 "uuid": "24cb19b9-1a2c-4a1d-85b7-1131ec969cc9", 00:19:14.099 "assigned_rate_limits": { 00:19:14.099 "rw_ios_per_sec": 0, 00:19:14.099 "rw_mbytes_per_sec": 0, 00:19:14.099 "r_mbytes_per_sec": 0, 00:19:14.099 "w_mbytes_per_sec": 0 00:19:14.099 }, 00:19:14.099 "claimed": true, 00:19:14.099 "claim_type": "exclusive_write", 00:19:14.099 "zoned": false, 00:19:14.099 "supported_io_types": { 00:19:14.099 "read": true, 00:19:14.099 "write": true, 00:19:14.099 "unmap": true, 00:19:14.099 "flush": true, 00:19:14.099 "reset": true, 00:19:14.099 "nvme_admin": false, 00:19:14.099 "nvme_io": false, 00:19:14.099 "nvme_io_md": false, 00:19:14.099 "write_zeroes": true, 00:19:14.099 "zcopy": true, 00:19:14.099 "get_zone_info": false, 00:19:14.099 "zone_management": false, 00:19:14.099 "zone_append": false, 00:19:14.099 "compare": false, 00:19:14.099 "compare_and_write": false, 00:19:14.099 "abort": true, 00:19:14.099 "seek_hole": false, 00:19:14.099 "seek_data": false, 00:19:14.099 "copy": true, 00:19:14.099 "nvme_iov_md": false 00:19:14.099 }, 00:19:14.099 "memory_domains": [ 00:19:14.099 { 00:19:14.099 "dma_device_id": "system", 00:19:14.099 "dma_device_type": 1 00:19:14.099 }, 00:19:14.099 { 00:19:14.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:14.099 "dma_device_type": 2 00:19:14.099 } 00:19:14.099 ], 00:19:14.099 "driver_specific": {} 00:19:14.099 
}' 00:19:14.099 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:14.099 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:14.099 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:14.099 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:14.099 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:14.099 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:14.099 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:14.358 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:14.359 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:14.359 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:14.359 13:27:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:14.359 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:14.359 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:14.359 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:14.359 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:14.619 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:14.619 "name": "BaseBdev3", 00:19:14.619 "aliases": [ 00:19:14.619 "57dd07ea-89fb-42f1-b678-accb711c32d3" 00:19:14.619 ], 00:19:14.619 "product_name": "Malloc disk", 00:19:14.619 "block_size": 512, 00:19:14.619 "num_blocks": 65536, 
00:19:14.619 "uuid": "57dd07ea-89fb-42f1-b678-accb711c32d3", 00:19:14.619 "assigned_rate_limits": { 00:19:14.619 "rw_ios_per_sec": 0, 00:19:14.619 "rw_mbytes_per_sec": 0, 00:19:14.619 "r_mbytes_per_sec": 0, 00:19:14.619 "w_mbytes_per_sec": 0 00:19:14.619 }, 00:19:14.619 "claimed": true, 00:19:14.619 "claim_type": "exclusive_write", 00:19:14.619 "zoned": false, 00:19:14.619 "supported_io_types": { 00:19:14.619 "read": true, 00:19:14.619 "write": true, 00:19:14.619 "unmap": true, 00:19:14.619 "flush": true, 00:19:14.619 "reset": true, 00:19:14.619 "nvme_admin": false, 00:19:14.619 "nvme_io": false, 00:19:14.619 "nvme_io_md": false, 00:19:14.619 "write_zeroes": true, 00:19:14.619 "zcopy": true, 00:19:14.619 "get_zone_info": false, 00:19:14.619 "zone_management": false, 00:19:14.619 "zone_append": false, 00:19:14.619 "compare": false, 00:19:14.619 "compare_and_write": false, 00:19:14.619 "abort": true, 00:19:14.619 "seek_hole": false, 00:19:14.619 "seek_data": false, 00:19:14.619 "copy": true, 00:19:14.619 "nvme_iov_md": false 00:19:14.619 }, 00:19:14.619 "memory_domains": [ 00:19:14.619 { 00:19:14.619 "dma_device_id": "system", 00:19:14.619 "dma_device_type": 1 00:19:14.619 }, 00:19:14.619 { 00:19:14.619 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:14.619 "dma_device_type": 2 00:19:14.619 } 00:19:14.619 ], 00:19:14.619 "driver_specific": {} 00:19:14.619 }' 00:19:14.619 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:14.619 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:14.619 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:14.619 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:14.619 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:14.619 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:19:14.619 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:19:14.878 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:19:14.879 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:19:14.879 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:19:14.879 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:19:14.879 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:19:14.879 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:19:14.879 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4
00:19:14.879 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:19:15.138 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:19:15.138 "name": "BaseBdev4",
00:19:15.138 "aliases": [
00:19:15.138 "dcfbaf5f-e52a-4437-9c04-4d2fce89e746"
00:19:15.138 ],
00:19:15.138 "product_name": "Malloc disk",
00:19:15.138 "block_size": 512,
00:19:15.138 "num_blocks": 65536,
00:19:15.138 "uuid": "dcfbaf5f-e52a-4437-9c04-4d2fce89e746",
00:19:15.138 "assigned_rate_limits": {
00:19:15.138 "rw_ios_per_sec": 0,
00:19:15.138 "rw_mbytes_per_sec": 0,
00:19:15.138 "r_mbytes_per_sec": 0,
00:19:15.138 "w_mbytes_per_sec": 0
00:19:15.138 },
00:19:15.138 "claimed": true,
00:19:15.138 "claim_type": "exclusive_write",
00:19:15.138 "zoned": false,
00:19:15.138 "supported_io_types": {
00:19:15.138 "read": true,
00:19:15.138 "write": true,
00:19:15.138 "unmap": true,
00:19:15.138 "flush": true,
00:19:15.138 "reset": true,
00:19:15.138 "nvme_admin": false,
00:19:15.138 "nvme_io": false,
00:19:15.138 "nvme_io_md": false,
00:19:15.138 "write_zeroes": true,
00:19:15.138 "zcopy": true,
00:19:15.138 "get_zone_info": false,
00:19:15.138 "zone_management": false,
00:19:15.138 "zone_append": false,
00:19:15.138 "compare": false,
00:19:15.138 "compare_and_write": false,
00:19:15.138 "abort": true,
00:19:15.138 "seek_hole": false,
00:19:15.138 "seek_data": false,
00:19:15.138 "copy": true,
00:19:15.138 "nvme_iov_md": false
00:19:15.138 },
00:19:15.138 "memory_domains": [
00:19:15.138 {
00:19:15.138 "dma_device_id": "system",
00:19:15.138 "dma_device_type": 1
00:19:15.138 },
00:19:15.138 {
00:19:15.138 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:15.138 "dma_device_type": 2
00:19:15.138 }
00:19:15.138 ],
00:19:15.138 "driver_specific": {}
00:19:15.138 }'
00:19:15.138 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:19:15.138 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:19:15.138 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:19:15.138 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:19:15.138 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:19:15.398 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:19:15.398 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:19:15.398 13:27:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:19:15.398 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:19:15.398 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:19:15.398 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:19:15.398 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:19:15.398 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
[2024-07-25 13:27:56.263620] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
[2024-07-25 13:27:56.263640] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
[2024-07-25 13:27:56.263680] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:15.658 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:15.918 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:15.918 "name": "Existed_Raid",
00:19:15.918 "uuid": "86f70516-6457-45c9-b53d-ac012b28c689",
00:19:15.918 "strip_size_kb": 64,
00:19:15.918 "state": "offline",
00:19:15.918 "raid_level": "concat",
00:19:15.918 "superblock": false,
00:19:15.918 "num_base_bdevs": 4,
00:19:15.918 "num_base_bdevs_discovered": 3,
00:19:15.918 "num_base_bdevs_operational": 3,
00:19:15.918 "base_bdevs_list": [
00:19:15.918 {
00:19:15.918 "name": null,
00:19:15.918 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:15.918 "is_configured": false,
00:19:15.918 "data_offset": 0,
00:19:15.918 "data_size": 65536
00:19:15.918 },
00:19:15.918 {
00:19:15.918 "name": "BaseBdev2",
00:19:15.918 "uuid": "24cb19b9-1a2c-4a1d-85b7-1131ec969cc9",
00:19:15.918 "is_configured": true,
00:19:15.918 "data_offset": 0,
00:19:15.918 "data_size": 65536
00:19:15.918 },
00:19:15.918 {
00:19:15.918 "name": "BaseBdev3",
00:19:15.918 "uuid": "57dd07ea-89fb-42f1-b678-accb711c32d3",
00:19:15.918 "is_configured": true,
00:19:15.918 "data_offset": 0,
00:19:15.918 "data_size": 65536
00:19:15.918 },
00:19:15.918 {
00:19:15.918 "name": "BaseBdev4",
00:19:15.918 "uuid": "dcfbaf5f-e52a-4437-9c04-4d2fce89e746",
00:19:15.918 "is_configured": true,
00:19:15.918 "data_offset": 0,
00:19:15.918 "data_size": 65536
00:19:15.918 }
00:19:15.918 ]
00:19:15.918 }'
00:19:15.918 13:27:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:15.918 13:27:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:19:16.485 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 ))
00:19:16.485 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:19:16.485 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:19:16.485 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:16.485 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:19:16.485 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:19:16.485 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2
[2024-07-25 13:27:57.426580] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:19:16.744 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:19:16.744 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:19:16.744 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:16.744 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:19:17.003 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:19:17.003 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:19:17.003 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3
[2024-07-25 13:27:57.817386] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:19:17.263 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:19:17.263 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:19:17.263 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:17.263 13:27:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]'
00:19:17.263 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid
00:19:17.263 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']'
00:19:17.263 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4
[2024-07-25 13:27:58.204138] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4
[2024-07-25 13:27:58.204165] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12dbfd0 name Existed_Raid, state offline
00:19:17.522 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ ))
00:19:17.522 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs ))
00:19:17.522 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:17.522 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)'
00:19:17.781 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev=
00:19:17.781 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']'
00:19:17.781 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']'
00:19:17.781 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 ))
00:19:17.781 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:19:17.781 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:19:18.041 BaseBdev2
00:19:18.041 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2
00:19:18.041 13:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2
00:19:18.041 13:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:19:18.041 13:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:19:18.041 13:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:19:18.041 13:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:19:18.041 13:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:18.041 13:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:19:18.301 [
00:19:18.301 {
00:19:18.301 "name": "BaseBdev2",
00:19:18.301 "aliases": [
00:19:18.301 "c2ecae8f-a651-4f31-8d04-6244779a0b19"
00:19:18.301 ],
00:19:18.301 "product_name": "Malloc disk",
00:19:18.301 "block_size": 512,
00:19:18.301 "num_blocks": 65536,
00:19:18.301 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19",
00:19:18.301 "assigned_rate_limits": {
00:19:18.301 "rw_ios_per_sec": 0,
00:19:18.301 "rw_mbytes_per_sec": 0,
00:19:18.301 "r_mbytes_per_sec": 0,
00:19:18.301 "w_mbytes_per_sec": 0
00:19:18.301 },
00:19:18.301 "claimed": false,
00:19:18.301 "zoned": false,
00:19:18.301 "supported_io_types": {
00:19:18.301 "read": true,
00:19:18.301 "write": true,
00:19:18.301 "unmap": true,
00:19:18.301 "flush": true,
00:19:18.301 "reset": true,
00:19:18.301 "nvme_admin": false,
00:19:18.301 "nvme_io": false,
00:19:18.301 "nvme_io_md": false,
00:19:18.301 "write_zeroes": true,
00:19:18.301 "zcopy": true,
00:19:18.301 "get_zone_info": false,
00:19:18.301 "zone_management": false,
00:19:18.301 "zone_append": false,
00:19:18.301 "compare": false,
00:19:18.301 "compare_and_write": false,
00:19:18.301 "abort": true,
00:19:18.301 "seek_hole": false,
00:19:18.301 "seek_data": false,
00:19:18.301 "copy": true,
00:19:18.301 "nvme_iov_md": false
00:19:18.301 },
00:19:18.301 "memory_domains": [
00:19:18.301 {
00:19:18.301 "dma_device_id": "system",
00:19:18.301 "dma_device_type": 1
00:19:18.301 },
00:19:18.301 {
00:19:18.301 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:18.301 "dma_device_type": 2
00:19:18.301 }
00:19:18.301 ],
00:19:18.301 "driver_specific": {}
00:19:18.301 }
00:19:18.301 ]
00:19:18.301 13:27:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0
00:19:18.301 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:19:18.301 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:19:18.301 13:27:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3
00:19:18.559 BaseBdev3
00:19:18.559 13:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3
00:19:18.559 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3
00:19:18.559 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:19:18.559 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:19:18.559 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:19:18.559 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:19:18.559 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:18.559 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000
00:19:18.818 [
00:19:18.818 {
00:19:18.818 "name": "BaseBdev3",
00:19:18.818 "aliases": [
00:19:18.818 "ebc9b905-2992-48a4-9737-911fb1396975"
00:19:18.818 ],
00:19:18.818 "product_name": "Malloc disk",
00:19:18.818 "block_size": 512,
00:19:18.818 "num_blocks": 65536,
00:19:18.818 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975",
00:19:18.818 "assigned_rate_limits": {
00:19:18.818 "rw_ios_per_sec": 0,
00:19:18.818 "rw_mbytes_per_sec": 0,
00:19:18.818 "r_mbytes_per_sec": 0,
00:19:18.818 "w_mbytes_per_sec": 0
00:19:18.818 },
00:19:18.818 "claimed": false,
00:19:18.818 "zoned": false,
00:19:18.818 "supported_io_types": {
00:19:18.818 "read": true,
00:19:18.818 "write": true,
00:19:18.818 "unmap": true,
00:19:18.818 "flush": true,
00:19:18.818 "reset": true,
00:19:18.818 "nvme_admin": false,
00:19:18.818 "nvme_io": false,
00:19:18.818 "nvme_io_md": false,
00:19:18.818 "write_zeroes": true,
00:19:18.818 "zcopy": true,
00:19:18.818 "get_zone_info": false,
00:19:18.818 "zone_management": false,
00:19:18.818 "zone_append": false,
00:19:18.818 "compare": false,
00:19:18.818 "compare_and_write": false,
00:19:18.818 "abort": true,
00:19:18.818 "seek_hole": false,
00:19:18.818 "seek_data": false,
00:19:18.818 "copy": true,
00:19:18.818 "nvme_iov_md": false
00:19:18.818 },
00:19:18.818 "memory_domains": [
00:19:18.818 {
00:19:18.818 "dma_device_id": "system",
00:19:18.818 "dma_device_type": 1
00:19:18.818 },
00:19:18.818 {
00:19:18.818 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:18.818 "dma_device_type": 2
00:19:18.818 }
00:19:18.818 ],
00:19:18.818 "driver_specific": {}
00:19:18.818 }
00:19:18.818 ]
00:19:18.818 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0
00:19:18.818 13:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:19:18.818 13:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:19:18.818 13:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4
00:19:19.077 BaseBdev4
00:19:19.077 13:27:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4
00:19:19.077 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4
00:19:19.077 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:19:19.077 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:19:19.077 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:19:19.077 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:19:19.077 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:19.336 13:27:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000
00:19:19.336 [
00:19:19.336 {
00:19:19.336 "name": "BaseBdev4",
00:19:19.336 "aliases": [
00:19:19.336 "5e7df02a-4939-4dc8-996e-81e29323be2a"
00:19:19.336 ],
00:19:19.336 "product_name": "Malloc disk",
00:19:19.336 "block_size": 512,
00:19:19.336 "num_blocks": 65536,
00:19:19.336 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a",
00:19:19.336 "assigned_rate_limits": {
00:19:19.336 "rw_ios_per_sec": 0,
00:19:19.336 "rw_mbytes_per_sec": 0,
00:19:19.336 "r_mbytes_per_sec": 0,
00:19:19.336 "w_mbytes_per_sec": 0
00:19:19.336 },
00:19:19.336 "claimed": false,
00:19:19.336 "zoned": false,
00:19:19.336 "supported_io_types": {
00:19:19.336 "read": true,
00:19:19.336 "write": true,
00:19:19.336 "unmap": true,
00:19:19.336 "flush": true,
00:19:19.336 "reset": true,
00:19:19.336 "nvme_admin": false,
00:19:19.336 "nvme_io": false,
00:19:19.336 "nvme_io_md": false,
00:19:19.336 "write_zeroes": true,
00:19:19.336 "zcopy": true,
00:19:19.336 "get_zone_info": false,
00:19:19.336 "zone_management": false,
00:19:19.336 "zone_append": false,
00:19:19.336 "compare": false,
00:19:19.336 "compare_and_write": false,
00:19:19.336 "abort": true,
00:19:19.336 "seek_hole": false,
00:19:19.336 "seek_data": false,
00:19:19.336 "copy": true,
00:19:19.336 "nvme_iov_md": false
00:19:19.336 },
00:19:19.336 "memory_domains": [
00:19:19.336 {
00:19:19.336 "dma_device_id": "system",
00:19:19.336 "dma_device_type": 1
00:19:19.336 },
00:19:19.336 {
00:19:19.336 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:19.336 "dma_device_type": 2
00:19:19.336 }
00:19:19.336 ],
00:19:19.336 "driver_specific": {}
00:19:19.336 }
00:19:19.336 ]
00:19:19.336 13:28:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0
00:19:19.336 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ ))
00:19:19.336 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs ))
00:19:19.336 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid
[2024-07-25 13:28:00.243284] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
[2024-07-25 13:28:00.243313] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
[2024-07-25 13:28:00.243325] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
[2024-07-25 13:28:00.244363] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed
[2024-07-25 13:28:00.244395] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:19.595 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:19.853 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:19.853 "name": "Existed_Raid",
00:19:19.853 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:19.853 "strip_size_kb": 64,
00:19:19.853 "state": "configuring",
00:19:19.853 "raid_level": "concat",
00:19:19.853 "superblock": false,
00:19:19.853 "num_base_bdevs": 4,
00:19:19.853 "num_base_bdevs_discovered": 3,
00:19:19.853 "num_base_bdevs_operational": 4,
00:19:19.853 "base_bdevs_list": [
00:19:19.853 {
00:19:19.853 "name": "BaseBdev1",
00:19:19.853 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:19.853 "is_configured": false,
00:19:19.853 "data_offset": 0,
00:19:19.853 "data_size": 0
00:19:19.853 },
00:19:19.853 {
00:19:19.853 "name": "BaseBdev2",
00:19:19.853 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19",
00:19:19.853 "is_configured": true,
00:19:19.853 "data_offset": 0,
00:19:19.853 "data_size": 65536
00:19:19.853 },
00:19:19.853 {
00:19:19.853 "name": "BaseBdev3",
00:19:19.853 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975",
00:19:19.853 "is_configured": true,
00:19:19.853 "data_offset": 0,
00:19:19.853 "data_size": 65536
00:19:19.853 },
00:19:19.853 {
00:19:19.853 "name": "BaseBdev4",
00:19:19.853 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a",
00:19:19.853 "is_configured": true,
00:19:19.853 "data_offset": 0,
00:19:19.853 "data_size": 65536
00:19:19.853 }
00:19:19.853 ]
00:19:19.853 }'
00:19:19.853 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:19.853 13:28:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:19:20.425 13:28:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2
[2024-07-25 13:28:01.153580] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:20.425 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:20.683 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:20.683 "name": "Existed_Raid",
00:19:20.683 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:20.683 "strip_size_kb": 64,
00:19:20.683 "state": "configuring",
00:19:20.683 "raid_level": "concat",
00:19:20.683 "superblock": false,
00:19:20.683 "num_base_bdevs": 4,
00:19:20.683 "num_base_bdevs_discovered": 2,
00:19:20.683 "num_base_bdevs_operational": 4,
00:19:20.683 "base_bdevs_list": [
00:19:20.683 {
00:19:20.683 "name": "BaseBdev1",
00:19:20.683 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:20.683 "is_configured": false,
00:19:20.683 "data_offset": 0,
00:19:20.683 "data_size": 0
00:19:20.683 },
00:19:20.683 {
00:19:20.683 "name": null,
00:19:20.683 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19",
00:19:20.683 "is_configured": false,
00:19:20.683 "data_offset": 0,
00:19:20.683 "data_size": 65536
00:19:20.683 },
00:19:20.683 {
00:19:20.683 "name": "BaseBdev3",
00:19:20.683 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975",
00:19:20.683 "is_configured": true,
00:19:20.683 "data_offset": 0,
00:19:20.683 "data_size": 65536
00:19:20.683 },
00:19:20.683 {
00:19:20.683 "name": "BaseBdev4",
00:19:20.683 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a",
00:19:20.683 "is_configured": true,
00:19:20.683 "data_offset": 0,
00:19:20.683 "data_size": 65536
00:19:20.683 }
00:19:20.683 ]
00:19:20.683 }'
00:19:20.683 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:20.683 13:28:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:19:21.251 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:21.251 13:28:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured'
00:19:21.510 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]]
00:19:21.510 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
[2024-07-25 13:28:02.281366] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:19:21.510 BaseBdev1
00:19:21.510 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1
00:19:21.510 13:28:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1
00:19:21.510 13:28:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout=
00:19:21.510 13:28:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i
00:19:21.511 13:28:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]]
00:19:21.511 13:28:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000
00:19:21.511 13:28:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:19:21.771 13:28:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:19:22.031 [
00:19:22.031 {
00:19:22.031 "name": "BaseBdev1",
00:19:22.031 "aliases": [
00:19:22.031 "77f5f01d-95a2-452a-acb2-1890d66b04f0"
00:19:22.031 ],
00:19:22.031 "product_name": "Malloc disk",
00:19:22.031 "block_size": 512,
00:19:22.031 "num_blocks": 65536,
00:19:22.031 "uuid": "77f5f01d-95a2-452a-acb2-1890d66b04f0",
00:19:22.031 "assigned_rate_limits": {
00:19:22.031 "rw_ios_per_sec": 0,
00:19:22.031 "rw_mbytes_per_sec": 0,
00:19:22.031 "r_mbytes_per_sec": 0,
00:19:22.031 "w_mbytes_per_sec": 0
00:19:22.031 },
00:19:22.031 "claimed": true,
00:19:22.031 "claim_type": "exclusive_write",
00:19:22.031 "zoned": false,
00:19:22.031 "supported_io_types": {
00:19:22.031 "read": true,
00:19:22.031 "write": true,
00:19:22.031 "unmap": true,
00:19:22.031 "flush": true,
00:19:22.031 "reset": true,
00:19:22.031 "nvme_admin": false,
00:19:22.031 "nvme_io": false,
00:19:22.031 "nvme_io_md": false,
00:19:22.031 "write_zeroes": true,
00:19:22.031 "zcopy": true,
00:19:22.031 "get_zone_info": false,
00:19:22.031 "zone_management": false,
00:19:22.031 "zone_append": false,
00:19:22.031 "compare": false,
00:19:22.031 "compare_and_write": false,
00:19:22.031 "abort": true,
00:19:22.031 "seek_hole": false,
00:19:22.031 "seek_data": false,
00:19:22.031 "copy": true,
00:19:22.031 "nvme_iov_md": false
00:19:22.031 },
00:19:22.031 "memory_domains": [
00:19:22.031 {
00:19:22.031 "dma_device_id": "system",
00:19:22.031 "dma_device_type": 1
00:19:22.031 },
00:19:22.031 {
00:19:22.031 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:19:22.031 "dma_device_type": 2
00:19:22.031 }
00:19:22.031 ],
00:19:22.031 "driver_specific": {}
00:19:22.031 }
00:19:22.031 ]
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:22.031 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:22.291 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:22.291 "name": "Existed_Raid",
00:19:22.291 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:22.291 "strip_size_kb": 64,
00:19:22.291 "state": "configuring",
00:19:22.291 "raid_level": "concat",
00:19:22.291 "superblock": false,
00:19:22.291 "num_base_bdevs": 4,
00:19:22.291 "num_base_bdevs_discovered": 3,
00:19:22.291 "num_base_bdevs_operational": 4,
00:19:22.291 "base_bdevs_list": [
00:19:22.291 {
00:19:22.291 "name": "BaseBdev1",
00:19:22.291 "uuid": "77f5f01d-95a2-452a-acb2-1890d66b04f0",
00:19:22.291 "is_configured": true,
00:19:22.291 "data_offset": 0,
00:19:22.291 "data_size": 65536
00:19:22.291 },
00:19:22.291 {
00:19:22.291 "name": null,
00:19:22.291 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19",
00:19:22.291 "is_configured": false,
00:19:22.291 "data_offset": 0,
00:19:22.291 "data_size": 65536
00:19:22.291 },
00:19:22.291 {
00:19:22.291 "name": "BaseBdev3",
00:19:22.291 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975",
00:19:22.291 "is_configured": true,
00:19:22.291 "data_offset": 0,
00:19:22.291 "data_size": 65536
00:19:22.291 },
00:19:22.291 {
00:19:22.291 "name": "BaseBdev4",
00:19:22.291 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a",
00:19:22.291 "is_configured": true,
00:19:22.291 "data_offset": 0,
00:19:22.291 "data_size": 65536
00:19:22.291 }
00:19:22.291 ]
00:19:22.291 }'
00:19:22.291 13:28:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:19:22.291 13:28:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x
00:19:22.860 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:22.860 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured'
00:19:22.860 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]]
00:19:22.860 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3
[2024-07-25 13:28:03.801213] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:19:23.119 13:28:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:19:23.377 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:19:23.377 "name": "Existed_Raid",
00:19:23.377 "uuid": "00000000-0000-0000-0000-000000000000",
00:19:23.377 "strip_size_kb": 64,
00:19:23.377 "state": "configuring",
00:19:23.377 "raid_level": "concat",
00:19:23.377 "superblock": false,
00:19:23.377 "num_base_bdevs": 4,
00:19:23.377 "num_base_bdevs_discovered": 2,
00:19:23.377 "num_base_bdevs_operational": 4,
00:19:23.377 "base_bdevs_list": [
00:19:23.377 {
00:19:23.377 "name": "BaseBdev1",
00:19:23.377 "uuid": "77f5f01d-95a2-452a-acb2-1890d66b04f0",
00:19:23.377 "is_configured": true,
00:19:23.377 "data_offset": 0,
00:19:23.377 "data_size": 65536
00:19:23.377 },
00:19:23.377 {
00:19:23.377 "name": null,
00:19:23.377 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19",
00:19:23.377 "is_configured": false,
00:19:23.377 "data_offset": 0,
"data_size": 65536 00:19:23.378 }, 00:19:23.378 { 00:19:23.378 "name": null, 00:19:23.378 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975", 00:19:23.378 "is_configured": false, 00:19:23.378 "data_offset": 0, 00:19:23.378 "data_size": 65536 00:19:23.378 }, 00:19:23.378 { 00:19:23.378 "name": "BaseBdev4", 00:19:23.378 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a", 00:19:23.378 "is_configured": true, 00:19:23.378 "data_offset": 0, 00:19:23.378 "data_size": 65536 00:19:23.378 } 00:19:23.378 ] 00:19:23.378 }' 00:19:23.378 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:23.378 13:28:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:23.946 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:23.946 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:24.206 [2024-07-25 13:28:04.928075] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:24.206 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:24.207 13:28:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:24.467 13:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:24.467 "name": "Existed_Raid", 00:19:24.467 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:24.467 "strip_size_kb": 64, 00:19:24.467 "state": "configuring", 00:19:24.467 "raid_level": "concat", 00:19:24.467 "superblock": false, 00:19:24.467 "num_base_bdevs": 4, 00:19:24.467 "num_base_bdevs_discovered": 3, 00:19:24.467 "num_base_bdevs_operational": 4, 00:19:24.467 "base_bdevs_list": [ 00:19:24.467 { 00:19:24.467 "name": "BaseBdev1", 00:19:24.467 "uuid": "77f5f01d-95a2-452a-acb2-1890d66b04f0", 00:19:24.467 "is_configured": true, 00:19:24.467 "data_offset": 0, 00:19:24.467 "data_size": 65536 00:19:24.467 }, 00:19:24.467 { 00:19:24.467 "name": null, 00:19:24.467 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19", 00:19:24.467 "is_configured": false, 00:19:24.467 "data_offset": 0, 00:19:24.467 "data_size": 65536 00:19:24.467 }, 00:19:24.467 { 00:19:24.467 "name": 
"BaseBdev3", 00:19:24.467 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975", 00:19:24.467 "is_configured": true, 00:19:24.467 "data_offset": 0, 00:19:24.467 "data_size": 65536 00:19:24.467 }, 00:19:24.467 { 00:19:24.467 "name": "BaseBdev4", 00:19:24.467 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a", 00:19:24.467 "is_configured": true, 00:19:24.467 "data_offset": 0, 00:19:24.467 "data_size": 65536 00:19:24.467 } 00:19:24.467 ] 00:19:24.467 }' 00:19:24.467 13:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:24.467 13:28:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:25.036 13:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.036 13:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:25.296 13:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:25.296 13:28:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:25.296 [2024-07-25 13:28:06.058950] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:25.296 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:25.555 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:25.555 "name": "Existed_Raid", 00:19:25.555 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:25.555 "strip_size_kb": 64, 00:19:25.555 "state": "configuring", 00:19:25.555 "raid_level": "concat", 00:19:25.555 "superblock": false, 00:19:25.555 "num_base_bdevs": 4, 00:19:25.555 "num_base_bdevs_discovered": 2, 00:19:25.555 "num_base_bdevs_operational": 4, 00:19:25.555 "base_bdevs_list": [ 00:19:25.555 { 00:19:25.555 "name": null, 00:19:25.556 "uuid": "77f5f01d-95a2-452a-acb2-1890d66b04f0", 00:19:25.556 "is_configured": false, 00:19:25.556 "data_offset": 0, 00:19:25.556 "data_size": 65536 00:19:25.556 }, 00:19:25.556 { 00:19:25.556 "name": null, 00:19:25.556 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19", 00:19:25.556 "is_configured": false, 00:19:25.556 "data_offset": 0, 00:19:25.556 "data_size": 65536 00:19:25.556 }, 00:19:25.556 { 00:19:25.556 "name": "BaseBdev3", 00:19:25.556 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975", 00:19:25.556 "is_configured": true, 
00:19:25.556 "data_offset": 0, 00:19:25.556 "data_size": 65536 00:19:25.556 }, 00:19:25.556 { 00:19:25.556 "name": "BaseBdev4", 00:19:25.556 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a", 00:19:25.556 "is_configured": true, 00:19:25.556 "data_offset": 0, 00:19:25.556 "data_size": 65536 00:19:25.556 } 00:19:25.556 ] 00:19:25.556 }' 00:19:25.556 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:25.556 13:28:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:26.126 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.126 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:26.386 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:26.386 13:28:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:26.386 [2024-07-25 13:28:07.163501] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:26.645 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:26.645 "name": "Existed_Raid", 00:19:26.645 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:26.645 "strip_size_kb": 64, 00:19:26.645 "state": "configuring", 00:19:26.646 "raid_level": "concat", 00:19:26.646 "superblock": false, 00:19:26.646 "num_base_bdevs": 4, 00:19:26.646 "num_base_bdevs_discovered": 3, 00:19:26.646 "num_base_bdevs_operational": 4, 00:19:26.646 "base_bdevs_list": [ 00:19:26.646 { 00:19:26.646 "name": null, 00:19:26.646 "uuid": "77f5f01d-95a2-452a-acb2-1890d66b04f0", 00:19:26.646 "is_configured": false, 00:19:26.646 "data_offset": 0, 00:19:26.646 "data_size": 65536 00:19:26.646 }, 00:19:26.646 { 00:19:26.646 "name": "BaseBdev2", 00:19:26.646 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19", 00:19:26.646 "is_configured": true, 00:19:26.646 "data_offset": 0, 00:19:26.646 "data_size": 65536 00:19:26.646 }, 00:19:26.646 { 00:19:26.646 "name": "BaseBdev3", 00:19:26.646 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975", 00:19:26.646 "is_configured": true, 00:19:26.646 "data_offset": 0, 00:19:26.646 "data_size": 65536 00:19:26.646 
}, 00:19:26.646 { 00:19:26.646 "name": "BaseBdev4", 00:19:26.646 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a", 00:19:26.646 "is_configured": true, 00:19:26.646 "data_offset": 0, 00:19:26.646 "data_size": 65536 00:19:26.646 } 00:19:26.646 ] 00:19:26.646 }' 00:19:26.646 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:26.646 13:28:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:27.216 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:27.216 13:28:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.476 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:27.476 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:27.476 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:27.736 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 77f5f01d-95a2-452a-acb2-1890d66b04f0 00:19:27.736 [2024-07-25 13:28:08.491874] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:27.736 [2024-07-25 13:28:08.491900] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x12da950 00:19:27.736 [2024-07-25 13:28:08.491905] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:19:27.736 [2024-07-25 13:28:08.492060] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1480640 00:19:27.736 
[2024-07-25 13:28:08.492152] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12da950 00:19:27.736 [2024-07-25 13:28:08.492157] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12da950 00:19:27.736 [2024-07-25 13:28:08.492276] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:27.736 NewBaseBdev 00:19:27.736 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:27.736 13:28:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:19:27.736 13:28:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:27.736 13:28:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:19:27.736 13:28:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:27.736 13:28:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:27.736 13:28:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:27.996 13:28:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:28.256 [ 00:19:28.256 { 00:19:28.256 "name": "NewBaseBdev", 00:19:28.256 "aliases": [ 00:19:28.256 "77f5f01d-95a2-452a-acb2-1890d66b04f0" 00:19:28.256 ], 00:19:28.256 "product_name": "Malloc disk", 00:19:28.256 "block_size": 512, 00:19:28.256 "num_blocks": 65536, 00:19:28.256 "uuid": "77f5f01d-95a2-452a-acb2-1890d66b04f0", 00:19:28.256 "assigned_rate_limits": { 00:19:28.256 "rw_ios_per_sec": 0, 00:19:28.256 "rw_mbytes_per_sec": 0, 00:19:28.256 "r_mbytes_per_sec": 0, 00:19:28.256 
"w_mbytes_per_sec": 0 00:19:28.256 }, 00:19:28.256 "claimed": true, 00:19:28.257 "claim_type": "exclusive_write", 00:19:28.257 "zoned": false, 00:19:28.257 "supported_io_types": { 00:19:28.257 "read": true, 00:19:28.257 "write": true, 00:19:28.257 "unmap": true, 00:19:28.257 "flush": true, 00:19:28.257 "reset": true, 00:19:28.257 "nvme_admin": false, 00:19:28.257 "nvme_io": false, 00:19:28.257 "nvme_io_md": false, 00:19:28.257 "write_zeroes": true, 00:19:28.257 "zcopy": true, 00:19:28.257 "get_zone_info": false, 00:19:28.257 "zone_management": false, 00:19:28.257 "zone_append": false, 00:19:28.257 "compare": false, 00:19:28.257 "compare_and_write": false, 00:19:28.257 "abort": true, 00:19:28.257 "seek_hole": false, 00:19:28.257 "seek_data": false, 00:19:28.257 "copy": true, 00:19:28.257 "nvme_iov_md": false 00:19:28.257 }, 00:19:28.257 "memory_domains": [ 00:19:28.257 { 00:19:28.257 "dma_device_id": "system", 00:19:28.257 "dma_device_type": 1 00:19:28.257 }, 00:19:28.257 { 00:19:28.257 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:28.257 "dma_device_type": 2 00:19:28.257 } 00:19:28.257 ], 00:19:28.257 "driver_specific": {} 00:19:28.257 } 00:19:28.257 ] 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 
00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.257 13:28:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:28.517 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:28.517 "name": "Existed_Raid", 00:19:28.517 "uuid": "42d27fa9-b0cb-462e-a6f1-b25f275a0e1a", 00:19:28.517 "strip_size_kb": 64, 00:19:28.517 "state": "online", 00:19:28.517 "raid_level": "concat", 00:19:28.517 "superblock": false, 00:19:28.517 "num_base_bdevs": 4, 00:19:28.517 "num_base_bdevs_discovered": 4, 00:19:28.517 "num_base_bdevs_operational": 4, 00:19:28.517 "base_bdevs_list": [ 00:19:28.517 { 00:19:28.517 "name": "NewBaseBdev", 00:19:28.517 "uuid": "77f5f01d-95a2-452a-acb2-1890d66b04f0", 00:19:28.517 "is_configured": true, 00:19:28.517 "data_offset": 0, 00:19:28.517 "data_size": 65536 00:19:28.517 }, 00:19:28.517 { 00:19:28.517 "name": "BaseBdev2", 00:19:28.517 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19", 00:19:28.517 "is_configured": true, 00:19:28.517 "data_offset": 0, 00:19:28.517 "data_size": 65536 00:19:28.517 }, 00:19:28.517 { 00:19:28.517 "name": "BaseBdev3", 00:19:28.517 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975", 00:19:28.517 "is_configured": true, 00:19:28.517 "data_offset": 0, 00:19:28.517 "data_size": 65536 00:19:28.517 }, 00:19:28.517 { 00:19:28.517 "name": "BaseBdev4", 
00:19:28.517 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a", 00:19:28.517 "is_configured": true, 00:19:28.517 "data_offset": 0, 00:19:28.517 "data_size": 65536 00:19:28.517 } 00:19:28.517 ] 00:19:28.517 }' 00:19:28.517 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:28.517 13:28:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:29.087 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:29.087 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:29.087 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:29.087 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:29.087 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:29.087 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:29.087 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:29.088 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:29.088 [2024-07-25 13:28:09.779406] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:29.088 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:29.088 "name": "Existed_Raid", 00:19:29.088 "aliases": [ 00:19:29.088 "42d27fa9-b0cb-462e-a6f1-b25f275a0e1a" 00:19:29.088 ], 00:19:29.088 "product_name": "Raid Volume", 00:19:29.088 "block_size": 512, 00:19:29.088 "num_blocks": 262144, 00:19:29.088 "uuid": "42d27fa9-b0cb-462e-a6f1-b25f275a0e1a", 00:19:29.088 "assigned_rate_limits": { 00:19:29.088 "rw_ios_per_sec": 0, 00:19:29.088 
"rw_mbytes_per_sec": 0, 00:19:29.088 "r_mbytes_per_sec": 0, 00:19:29.088 "w_mbytes_per_sec": 0 00:19:29.088 }, 00:19:29.088 "claimed": false, 00:19:29.088 "zoned": false, 00:19:29.088 "supported_io_types": { 00:19:29.088 "read": true, 00:19:29.088 "write": true, 00:19:29.088 "unmap": true, 00:19:29.088 "flush": true, 00:19:29.088 "reset": true, 00:19:29.088 "nvme_admin": false, 00:19:29.088 "nvme_io": false, 00:19:29.088 "nvme_io_md": false, 00:19:29.088 "write_zeroes": true, 00:19:29.088 "zcopy": false, 00:19:29.088 "get_zone_info": false, 00:19:29.088 "zone_management": false, 00:19:29.088 "zone_append": false, 00:19:29.088 "compare": false, 00:19:29.088 "compare_and_write": false, 00:19:29.088 "abort": false, 00:19:29.088 "seek_hole": false, 00:19:29.088 "seek_data": false, 00:19:29.088 "copy": false, 00:19:29.088 "nvme_iov_md": false 00:19:29.088 }, 00:19:29.088 "memory_domains": [ 00:19:29.088 { 00:19:29.088 "dma_device_id": "system", 00:19:29.088 "dma_device_type": 1 00:19:29.088 }, 00:19:29.088 { 00:19:29.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.088 "dma_device_type": 2 00:19:29.088 }, 00:19:29.088 { 00:19:29.088 "dma_device_id": "system", 00:19:29.088 "dma_device_type": 1 00:19:29.088 }, 00:19:29.088 { 00:19:29.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.088 "dma_device_type": 2 00:19:29.088 }, 00:19:29.088 { 00:19:29.088 "dma_device_id": "system", 00:19:29.088 "dma_device_type": 1 00:19:29.088 }, 00:19:29.088 { 00:19:29.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.088 "dma_device_type": 2 00:19:29.088 }, 00:19:29.088 { 00:19:29.088 "dma_device_id": "system", 00:19:29.088 "dma_device_type": 1 00:19:29.088 }, 00:19:29.088 { 00:19:29.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.088 "dma_device_type": 2 00:19:29.088 } 00:19:29.088 ], 00:19:29.088 "driver_specific": { 00:19:29.088 "raid": { 00:19:29.088 "uuid": "42d27fa9-b0cb-462e-a6f1-b25f275a0e1a", 00:19:29.088 "strip_size_kb": 64, 00:19:29.088 "state": "online", 
00:19:29.088 "raid_level": "concat", 00:19:29.088 "superblock": false, 00:19:29.088 "num_base_bdevs": 4, 00:19:29.088 "num_base_bdevs_discovered": 4, 00:19:29.088 "num_base_bdevs_operational": 4, 00:19:29.088 "base_bdevs_list": [ 00:19:29.088 { 00:19:29.088 "name": "NewBaseBdev", 00:19:29.088 "uuid": "77f5f01d-95a2-452a-acb2-1890d66b04f0", 00:19:29.088 "is_configured": true, 00:19:29.088 "data_offset": 0, 00:19:29.088 "data_size": 65536 00:19:29.088 }, 00:19:29.088 { 00:19:29.088 "name": "BaseBdev2", 00:19:29.088 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19", 00:19:29.088 "is_configured": true, 00:19:29.088 "data_offset": 0, 00:19:29.088 "data_size": 65536 00:19:29.088 }, 00:19:29.088 { 00:19:29.088 "name": "BaseBdev3", 00:19:29.088 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975", 00:19:29.088 "is_configured": true, 00:19:29.088 "data_offset": 0, 00:19:29.088 "data_size": 65536 00:19:29.088 }, 00:19:29.088 { 00:19:29.088 "name": "BaseBdev4", 00:19:29.088 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a", 00:19:29.088 "is_configured": true, 00:19:29.088 "data_offset": 0, 00:19:29.088 "data_size": 65536 00:19:29.088 } 00:19:29.088 ] 00:19:29.088 } 00:19:29.088 } 00:19:29.088 }' 00:19:29.088 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:29.088 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:29.088 BaseBdev2 00:19:29.088 BaseBdev3 00:19:29.088 BaseBdev4' 00:19:29.088 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.088 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:29.088 13:28:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:29.348 13:28:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:29.348 "name": "NewBaseBdev", 00:19:29.348 "aliases": [ 00:19:29.348 "77f5f01d-95a2-452a-acb2-1890d66b04f0" 00:19:29.348 ], 00:19:29.348 "product_name": "Malloc disk", 00:19:29.348 "block_size": 512, 00:19:29.348 "num_blocks": 65536, 00:19:29.348 "uuid": "77f5f01d-95a2-452a-acb2-1890d66b04f0", 00:19:29.348 "assigned_rate_limits": { 00:19:29.348 "rw_ios_per_sec": 0, 00:19:29.348 "rw_mbytes_per_sec": 0, 00:19:29.349 "r_mbytes_per_sec": 0, 00:19:29.349 "w_mbytes_per_sec": 0 00:19:29.349 }, 00:19:29.349 "claimed": true, 00:19:29.349 "claim_type": "exclusive_write", 00:19:29.349 "zoned": false, 00:19:29.349 "supported_io_types": { 00:19:29.349 "read": true, 00:19:29.349 "write": true, 00:19:29.349 "unmap": true, 00:19:29.349 "flush": true, 00:19:29.349 "reset": true, 00:19:29.349 "nvme_admin": false, 00:19:29.349 "nvme_io": false, 00:19:29.349 "nvme_io_md": false, 00:19:29.349 "write_zeroes": true, 00:19:29.349 "zcopy": true, 00:19:29.349 "get_zone_info": false, 00:19:29.349 "zone_management": false, 00:19:29.349 "zone_append": false, 00:19:29.349 "compare": false, 00:19:29.349 "compare_and_write": false, 00:19:29.349 "abort": true, 00:19:29.349 "seek_hole": false, 00:19:29.349 "seek_data": false, 00:19:29.349 "copy": true, 00:19:29.349 "nvme_iov_md": false 00:19:29.349 }, 00:19:29.349 "memory_domains": [ 00:19:29.349 { 00:19:29.349 "dma_device_id": "system", 00:19:29.349 "dma_device_type": 1 00:19:29.349 }, 00:19:29.349 { 00:19:29.349 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.349 "dma_device_type": 2 00:19:29.349 } 00:19:29.349 ], 00:19:29.349 "driver_specific": {} 00:19:29.349 }' 00:19:29.349 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.349 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:29.608 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:19:29.608 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.608 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:29.608 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:29.608 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.608 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:29.608 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:29.608 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.608 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:29.868 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:29.868 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:29.868 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:29.868 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:29.868 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:29.868 "name": "BaseBdev2", 00:19:29.868 "aliases": [ 00:19:29.868 "c2ecae8f-a651-4f31-8d04-6244779a0b19" 00:19:29.868 ], 00:19:29.868 "product_name": "Malloc disk", 00:19:29.868 "block_size": 512, 00:19:29.868 "num_blocks": 65536, 00:19:29.868 "uuid": "c2ecae8f-a651-4f31-8d04-6244779a0b19", 00:19:29.868 "assigned_rate_limits": { 00:19:29.868 "rw_ios_per_sec": 0, 00:19:29.868 "rw_mbytes_per_sec": 0, 00:19:29.868 "r_mbytes_per_sec": 0, 00:19:29.868 "w_mbytes_per_sec": 0 00:19:29.868 }, 00:19:29.868 "claimed": true, 00:19:29.868 
"claim_type": "exclusive_write", 00:19:29.868 "zoned": false, 00:19:29.868 "supported_io_types": { 00:19:29.868 "read": true, 00:19:29.868 "write": true, 00:19:29.868 "unmap": true, 00:19:29.868 "flush": true, 00:19:29.868 "reset": true, 00:19:29.868 "nvme_admin": false, 00:19:29.868 "nvme_io": false, 00:19:29.868 "nvme_io_md": false, 00:19:29.868 "write_zeroes": true, 00:19:29.868 "zcopy": true, 00:19:29.868 "get_zone_info": false, 00:19:29.868 "zone_management": false, 00:19:29.868 "zone_append": false, 00:19:29.868 "compare": false, 00:19:29.868 "compare_and_write": false, 00:19:29.868 "abort": true, 00:19:29.868 "seek_hole": false, 00:19:29.868 "seek_data": false, 00:19:29.868 "copy": true, 00:19:29.868 "nvme_iov_md": false 00:19:29.868 }, 00:19:29.868 "memory_domains": [ 00:19:29.868 { 00:19:29.868 "dma_device_id": "system", 00:19:29.868 "dma_device_type": 1 00:19:29.868 }, 00:19:29.868 { 00:19:29.868 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:29.868 "dma_device_type": 2 00:19:29.868 } 00:19:29.868 ], 00:19:29.868 "driver_specific": {} 00:19:29.868 }' 00:19:29.868 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.128 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.128 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:30.128 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.128 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.128 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:30.128 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.128 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.128 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == 
null ]] 00:19:30.128 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.388 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.388 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:30.388 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:30.388 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:30.388 13:28:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:30.647 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:30.647 "name": "BaseBdev3", 00:19:30.647 "aliases": [ 00:19:30.647 "ebc9b905-2992-48a4-9737-911fb1396975" 00:19:30.647 ], 00:19:30.647 "product_name": "Malloc disk", 00:19:30.647 "block_size": 512, 00:19:30.647 "num_blocks": 65536, 00:19:30.647 "uuid": "ebc9b905-2992-48a4-9737-911fb1396975", 00:19:30.647 "assigned_rate_limits": { 00:19:30.647 "rw_ios_per_sec": 0, 00:19:30.647 "rw_mbytes_per_sec": 0, 00:19:30.647 "r_mbytes_per_sec": 0, 00:19:30.647 "w_mbytes_per_sec": 0 00:19:30.647 }, 00:19:30.647 "claimed": true, 00:19:30.647 "claim_type": "exclusive_write", 00:19:30.647 "zoned": false, 00:19:30.647 "supported_io_types": { 00:19:30.647 "read": true, 00:19:30.647 "write": true, 00:19:30.647 "unmap": true, 00:19:30.647 "flush": true, 00:19:30.647 "reset": true, 00:19:30.647 "nvme_admin": false, 00:19:30.647 "nvme_io": false, 00:19:30.647 "nvme_io_md": false, 00:19:30.647 "write_zeroes": true, 00:19:30.647 "zcopy": true, 00:19:30.647 "get_zone_info": false, 00:19:30.647 "zone_management": false, 00:19:30.647 "zone_append": false, 00:19:30.647 "compare": false, 00:19:30.647 "compare_and_write": false, 00:19:30.647 "abort": true, 00:19:30.647 
"seek_hole": false, 00:19:30.647 "seek_data": false, 00:19:30.647 "copy": true, 00:19:30.647 "nvme_iov_md": false 00:19:30.647 }, 00:19:30.647 "memory_domains": [ 00:19:30.647 { 00:19:30.647 "dma_device_id": "system", 00:19:30.647 "dma_device_type": 1 00:19:30.647 }, 00:19:30.647 { 00:19:30.647 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:30.647 "dma_device_type": 2 00:19:30.647 } 00:19:30.647 ], 00:19:30.647 "driver_specific": {} 00:19:30.647 }' 00:19:30.647 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.647 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:30.647 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:30.647 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.647 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:30.647 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:30.647 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.647 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:30.907 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:30.907 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.907 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:30.907 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:30.907 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:30.907 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 00:19:30.907 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:31.167 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:31.167 "name": "BaseBdev4", 00:19:31.167 "aliases": [ 00:19:31.167 "5e7df02a-4939-4dc8-996e-81e29323be2a" 00:19:31.167 ], 00:19:31.167 "product_name": "Malloc disk", 00:19:31.167 "block_size": 512, 00:19:31.167 "num_blocks": 65536, 00:19:31.167 "uuid": "5e7df02a-4939-4dc8-996e-81e29323be2a", 00:19:31.167 "assigned_rate_limits": { 00:19:31.167 "rw_ios_per_sec": 0, 00:19:31.167 "rw_mbytes_per_sec": 0, 00:19:31.167 "r_mbytes_per_sec": 0, 00:19:31.167 "w_mbytes_per_sec": 0 00:19:31.167 }, 00:19:31.167 "claimed": true, 00:19:31.167 "claim_type": "exclusive_write", 00:19:31.167 "zoned": false, 00:19:31.167 "supported_io_types": { 00:19:31.167 "read": true, 00:19:31.167 "write": true, 00:19:31.167 "unmap": true, 00:19:31.167 "flush": true, 00:19:31.167 "reset": true, 00:19:31.167 "nvme_admin": false, 00:19:31.167 "nvme_io": false, 00:19:31.167 "nvme_io_md": false, 00:19:31.167 "write_zeroes": true, 00:19:31.167 "zcopy": true, 00:19:31.167 "get_zone_info": false, 00:19:31.167 "zone_management": false, 00:19:31.167 "zone_append": false, 00:19:31.167 "compare": false, 00:19:31.167 "compare_and_write": false, 00:19:31.167 "abort": true, 00:19:31.167 "seek_hole": false, 00:19:31.167 "seek_data": false, 00:19:31.167 "copy": true, 00:19:31.167 "nvme_iov_md": false 00:19:31.167 }, 00:19:31.167 "memory_domains": [ 00:19:31.167 { 00:19:31.167 "dma_device_id": "system", 00:19:31.167 "dma_device_type": 1 00:19:31.167 }, 00:19:31.167 { 00:19:31.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:31.167 "dma_device_type": 2 00:19:31.167 } 00:19:31.167 ], 00:19:31.167 "driver_specific": {} 00:19:31.167 }' 00:19:31.167 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:31.167 13:28:11 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:31.167 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:31.167 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:31.167 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:31.167 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:31.167 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.427 13:28:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:31.427 13:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:31.427 13:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.427 13:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:31.427 13:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:31.427 13:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:31.997 [2024-07-25 13:28:12.622360] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:31.997 [2024-07-25 13:28:12.622380] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:31.997 [2024-07-25 13:28:12.622420] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:31.997 [2024-07-25 13:28:12.622465] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:31.997 [2024-07-25 13:28:12.622471] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12da950 name Existed_Raid, state offline 00:19:31.997 13:28:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 960601 00:19:31.997 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 960601 ']' 00:19:31.997 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 960601 00:19:31.997 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:19:31.997 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:31.997 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 960601 00:19:31.997 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:31.997 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:31.997 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 960601' 00:19:31.997 killing process with pid 960601 00:19:31.997 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 960601 00:19:31.997 [2024-07-25 13:28:12.706161] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:31.997 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 960601 00:19:31.997 [2024-07-25 13:28:12.726879] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:32.257 13:28:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:19:32.257 00:19:32.257 real 0m27.593s 00:19:32.257 user 0m51.696s 00:19:32.257 sys 0m4.054s 00:19:32.257 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:32.257 13:28:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:32.257 ************************************ 00:19:32.257 END TEST raid_state_function_test 
00:19:32.257 ************************************ 00:19:32.257 13:28:12 bdev_raid -- bdev/bdev_raid.sh@948 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:19:32.257 13:28:12 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:19:32.257 13:28:12 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:32.257 13:28:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:32.257 ************************************ 00:19:32.257 START TEST raid_state_function_test_sb 00:19:32.257 ************************************ 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test concat 4 true 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:19:32.258 
13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=965862 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 965862' 00:19:32.258 Process raid pid: 965862 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 965862 /var/tmp/spdk-raid.sock 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 965862 ']' 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:32.258 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:32.258 13:28:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:32.258 [2024-07-25 13:28:12.986170] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:19:32.258 [2024-07-25 13:28:12.986227] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:32.518 [2024-07-25 13:28:13.079074] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:32.518 [2024-07-25 13:28:13.147223] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:19:32.518 [2024-07-25 13:28:13.187314] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:32.518 [2024-07-25 13:28:13.187336] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:33.455 13:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:19:33.455 13:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:19:33.455 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:33.714 [2024-07-25 13:28:14.355356] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:33.714 [2024-07-25 13:28:14.355389] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:33.714 [2024-07-25 13:28:14.355395] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:33.714 [2024-07-25 13:28:14.355400] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:33.714 [2024-07-25 13:28:14.355405] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:33.714 [2024-07-25 13:28:14.355411] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist 
now 00:19:33.715 [2024-07-25 13:28:14.355415] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:33.715 [2024-07-25 13:28:14.355421] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:33.715 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:33.974 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:33.974 "name": "Existed_Raid", 00:19:33.974 "uuid": 
"9ac57c3d-f6f9-4f2b-bcb2-c22a4b41ae1b", 00:19:33.974 "strip_size_kb": 64, 00:19:33.974 "state": "configuring", 00:19:33.974 "raid_level": "concat", 00:19:33.974 "superblock": true, 00:19:33.974 "num_base_bdevs": 4, 00:19:33.974 "num_base_bdevs_discovered": 0, 00:19:33.974 "num_base_bdevs_operational": 4, 00:19:33.974 "base_bdevs_list": [ 00:19:33.974 { 00:19:33.974 "name": "BaseBdev1", 00:19:33.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.974 "is_configured": false, 00:19:33.974 "data_offset": 0, 00:19:33.974 "data_size": 0 00:19:33.974 }, 00:19:33.974 { 00:19:33.974 "name": "BaseBdev2", 00:19:33.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.974 "is_configured": false, 00:19:33.974 "data_offset": 0, 00:19:33.974 "data_size": 0 00:19:33.974 }, 00:19:33.974 { 00:19:33.974 "name": "BaseBdev3", 00:19:33.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.974 "is_configured": false, 00:19:33.974 "data_offset": 0, 00:19:33.974 "data_size": 0 00:19:33.974 }, 00:19:33.974 { 00:19:33.974 "name": "BaseBdev4", 00:19:33.974 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:33.974 "is_configured": false, 00:19:33.974 "data_offset": 0, 00:19:33.974 "data_size": 0 00:19:33.974 } 00:19:33.974 ] 00:19:33.974 }' 00:19:33.974 13:28:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:33.974 13:28:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:34.577 13:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:34.577 [2024-07-25 13:28:15.249501] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:34.577 [2024-07-25 13:28:15.249526] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x24826f0 name Existed_Raid, state configuring 00:19:34.577 13:28:15 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:34.848 [2024-07-25 13:28:15.446020] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:34.848 [2024-07-25 13:28:15.446038] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:34.848 [2024-07-25 13:28:15.446044] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:34.848 [2024-07-25 13:28:15.446050] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:34.848 [2024-07-25 13:28:15.446054] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:34.848 [2024-07-25 13:28:15.446060] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:34.848 [2024-07-25 13:28:15.446065] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:34.848 [2024-07-25 13:28:15.446070] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:34.848 13:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:35.108 [2024-07-25 13:28:15.641078] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:35.108 BaseBdev1 00:19:35.108 13:28:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:35.108 13:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:19:35.108 13:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 
-- # local bdev_timeout= 00:19:35.108 13:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:35.108 13:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:35.108 13:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:35.108 13:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:35.108 13:28:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:35.367 [ 00:19:35.367 { 00:19:35.367 "name": "BaseBdev1", 00:19:35.367 "aliases": [ 00:19:35.367 "12dc7230-7499-4130-8aa9-aed035affe05" 00:19:35.367 ], 00:19:35.367 "product_name": "Malloc disk", 00:19:35.367 "block_size": 512, 00:19:35.367 "num_blocks": 65536, 00:19:35.367 "uuid": "12dc7230-7499-4130-8aa9-aed035affe05", 00:19:35.367 "assigned_rate_limits": { 00:19:35.367 "rw_ios_per_sec": 0, 00:19:35.367 "rw_mbytes_per_sec": 0, 00:19:35.367 "r_mbytes_per_sec": 0, 00:19:35.367 "w_mbytes_per_sec": 0 00:19:35.367 }, 00:19:35.367 "claimed": true, 00:19:35.367 "claim_type": "exclusive_write", 00:19:35.367 "zoned": false, 00:19:35.367 "supported_io_types": { 00:19:35.367 "read": true, 00:19:35.367 "write": true, 00:19:35.367 "unmap": true, 00:19:35.367 "flush": true, 00:19:35.367 "reset": true, 00:19:35.367 "nvme_admin": false, 00:19:35.367 "nvme_io": false, 00:19:35.367 "nvme_io_md": false, 00:19:35.367 "write_zeroes": true, 00:19:35.367 "zcopy": true, 00:19:35.367 "get_zone_info": false, 00:19:35.367 "zone_management": false, 00:19:35.367 "zone_append": false, 00:19:35.367 "compare": false, 00:19:35.367 "compare_and_write": false, 00:19:35.367 "abort": true, 00:19:35.367 "seek_hole": 
false, 00:19:35.367 "seek_data": false, 00:19:35.367 "copy": true, 00:19:35.367 "nvme_iov_md": false 00:19:35.367 }, 00:19:35.367 "memory_domains": [ 00:19:35.367 { 00:19:35.367 "dma_device_id": "system", 00:19:35.367 "dma_device_type": 1 00:19:35.367 }, 00:19:35.367 { 00:19:35.367 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:35.367 "dma_device_type": 2 00:19:35.367 } 00:19:35.367 ], 00:19:35.367 "driver_specific": {} 00:19:35.367 } 00:19:35.367 ] 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:35.367 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:35.367 13:28:16 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:35.626 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:35.626 "name": "Existed_Raid", 00:19:35.626 "uuid": "bcac1da6-a8e0-4adb-8551-86e822c1f03d", 00:19:35.626 "strip_size_kb": 64, 00:19:35.626 "state": "configuring", 00:19:35.626 "raid_level": "concat", 00:19:35.626 "superblock": true, 00:19:35.626 "num_base_bdevs": 4, 00:19:35.626 "num_base_bdevs_discovered": 1, 00:19:35.626 "num_base_bdevs_operational": 4, 00:19:35.626 "base_bdevs_list": [ 00:19:35.626 { 00:19:35.626 "name": "BaseBdev1", 00:19:35.626 "uuid": "12dc7230-7499-4130-8aa9-aed035affe05", 00:19:35.626 "is_configured": true, 00:19:35.626 "data_offset": 2048, 00:19:35.626 "data_size": 63488 00:19:35.626 }, 00:19:35.626 { 00:19:35.626 "name": "BaseBdev2", 00:19:35.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.626 "is_configured": false, 00:19:35.626 "data_offset": 0, 00:19:35.626 "data_size": 0 00:19:35.626 }, 00:19:35.626 { 00:19:35.626 "name": "BaseBdev3", 00:19:35.626 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.626 "is_configured": false, 00:19:35.626 "data_offset": 0, 00:19:35.626 "data_size": 0 00:19:35.626 }, 00:19:35.626 { 00:19:35.626 "name": "BaseBdev4", 00:19:35.627 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:35.627 "is_configured": false, 00:19:35.627 "data_offset": 0, 00:19:35.627 "data_size": 0 00:19:35.627 } 00:19:35.627 ] 00:19:35.627 }' 00:19:35.627 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:35.627 13:28:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:36.195 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:36.195 [2024-07-25 
13:28:16.952407] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:36.195 [2024-07-25 13:28:16.952435] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2481f60 name Existed_Raid, state configuring 00:19:36.195 13:28:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:36.455 [2024-07-25 13:28:17.148946] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:36.455 [2024-07-25 13:28:17.150085] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:36.455 [2024-07-25 13:28:17.150108] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:36.455 [2024-07-25 13:28:17.150114] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:36.455 [2024-07-25 13:28:17.150120] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:36.455 [2024-07-25 13:28:17.150125] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:36.455 [2024-07-25 13:28:17.150130] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.455 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:36.715 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.715 "name": "Existed_Raid", 00:19:36.715 "uuid": "4f99d319-6a71-43cf-943d-5b8259e13ce0", 00:19:36.715 "strip_size_kb": 64, 00:19:36.715 "state": "configuring", 00:19:36.715 "raid_level": "concat", 00:19:36.715 "superblock": true, 00:19:36.715 "num_base_bdevs": 4, 00:19:36.715 "num_base_bdevs_discovered": 1, 00:19:36.715 "num_base_bdevs_operational": 4, 00:19:36.715 "base_bdevs_list": [ 00:19:36.715 { 00:19:36.715 "name": "BaseBdev1", 00:19:36.715 "uuid": "12dc7230-7499-4130-8aa9-aed035affe05", 00:19:36.715 "is_configured": true, 00:19:36.715 "data_offset": 2048, 00:19:36.715 "data_size": 63488 00:19:36.715 }, 00:19:36.715 { 00:19:36.715 "name": "BaseBdev2", 00:19:36.715 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:19:36.715 "is_configured": false, 00:19:36.715 "data_offset": 0, 00:19:36.715 "data_size": 0 00:19:36.715 }, 00:19:36.715 { 00:19:36.715 "name": "BaseBdev3", 00:19:36.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:36.715 "is_configured": false, 00:19:36.715 "data_offset": 0, 00:19:36.715 "data_size": 0 00:19:36.715 }, 00:19:36.715 { 00:19:36.715 "name": "BaseBdev4", 00:19:36.715 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:36.715 "is_configured": false, 00:19:36.715 "data_offset": 0, 00:19:36.715 "data_size": 0 00:19:36.715 } 00:19:36.715 ] 00:19:36.715 }' 00:19:36.715 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.715 13:28:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:37.284 13:28:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:37.544 [2024-07-25 13:28:18.104173] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:37.544 BaseBdev2 00:19:37.544 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:37.544 13:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:19:37.544 13:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:37.544 13:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:37.544 13:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:37.544 13:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:37.544 13:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:37.544 13:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:37.804 [ 00:19:37.804 { 00:19:37.804 "name": "BaseBdev2", 00:19:37.804 "aliases": [ 00:19:37.804 "40303f8e-6c34-4f7e-89e6-87d6deb3f7f2" 00:19:37.804 ], 00:19:37.804 "product_name": "Malloc disk", 00:19:37.804 "block_size": 512, 00:19:37.804 "num_blocks": 65536, 00:19:37.804 "uuid": "40303f8e-6c34-4f7e-89e6-87d6deb3f7f2", 00:19:37.804 "assigned_rate_limits": { 00:19:37.804 "rw_ios_per_sec": 0, 00:19:37.804 "rw_mbytes_per_sec": 0, 00:19:37.804 "r_mbytes_per_sec": 0, 00:19:37.804 "w_mbytes_per_sec": 0 00:19:37.804 }, 00:19:37.804 "claimed": true, 00:19:37.804 "claim_type": "exclusive_write", 00:19:37.804 "zoned": false, 00:19:37.804 "supported_io_types": { 00:19:37.804 "read": true, 00:19:37.804 "write": true, 00:19:37.804 "unmap": true, 00:19:37.804 "flush": true, 00:19:37.804 "reset": true, 00:19:37.804 "nvme_admin": false, 00:19:37.804 "nvme_io": false, 00:19:37.804 "nvme_io_md": false, 00:19:37.804 "write_zeroes": true, 00:19:37.804 "zcopy": true, 00:19:37.804 "get_zone_info": false, 00:19:37.804 "zone_management": false, 00:19:37.804 "zone_append": false, 00:19:37.804 "compare": false, 00:19:37.804 "compare_and_write": false, 00:19:37.804 "abort": true, 00:19:37.804 "seek_hole": false, 00:19:37.804 "seek_data": false, 00:19:37.804 "copy": true, 00:19:37.804 "nvme_iov_md": false 00:19:37.804 }, 00:19:37.804 "memory_domains": [ 00:19:37.804 { 00:19:37.804 "dma_device_id": "system", 00:19:37.804 "dma_device_type": 1 00:19:37.804 }, 00:19:37.804 { 00:19:37.804 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:37.804 "dma_device_type": 2 00:19:37.804 } 00:19:37.804 ], 00:19:37.804 "driver_specific": {} 00:19:37.804 } 00:19:37.804 ] 
00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:37.804 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:38.063 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:38.063 "name": "Existed_Raid", 
00:19:38.063 "uuid": "4f99d319-6a71-43cf-943d-5b8259e13ce0", 00:19:38.063 "strip_size_kb": 64, 00:19:38.063 "state": "configuring", 00:19:38.063 "raid_level": "concat", 00:19:38.063 "superblock": true, 00:19:38.063 "num_base_bdevs": 4, 00:19:38.063 "num_base_bdevs_discovered": 2, 00:19:38.063 "num_base_bdevs_operational": 4, 00:19:38.063 "base_bdevs_list": [ 00:19:38.063 { 00:19:38.063 "name": "BaseBdev1", 00:19:38.063 "uuid": "12dc7230-7499-4130-8aa9-aed035affe05", 00:19:38.063 "is_configured": true, 00:19:38.063 "data_offset": 2048, 00:19:38.063 "data_size": 63488 00:19:38.063 }, 00:19:38.063 { 00:19:38.063 "name": "BaseBdev2", 00:19:38.063 "uuid": "40303f8e-6c34-4f7e-89e6-87d6deb3f7f2", 00:19:38.063 "is_configured": true, 00:19:38.063 "data_offset": 2048, 00:19:38.064 "data_size": 63488 00:19:38.064 }, 00:19:38.064 { 00:19:38.064 "name": "BaseBdev3", 00:19:38.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:38.064 "is_configured": false, 00:19:38.064 "data_offset": 0, 00:19:38.064 "data_size": 0 00:19:38.064 }, 00:19:38.064 { 00:19:38.064 "name": "BaseBdev4", 00:19:38.064 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:38.064 "is_configured": false, 00:19:38.064 "data_offset": 0, 00:19:38.064 "data_size": 0 00:19:38.064 } 00:19:38.064 ] 00:19:38.064 }' 00:19:38.064 13:28:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:38.064 13:28:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:38.633 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:38.893 [2024-07-25 13:28:19.428374] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:38.893 BaseBdev3 00:19:38.893 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:38.893 
13:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:19:38.893 13:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:38.893 13:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:38.893 13:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:38.893 13:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:38.893 13:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:38.893 13:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:39.152 [ 00:19:39.152 { 00:19:39.152 "name": "BaseBdev3", 00:19:39.152 "aliases": [ 00:19:39.152 "fdbc3b23-3170-47e1-80a5-a81812bcb44b" 00:19:39.152 ], 00:19:39.152 "product_name": "Malloc disk", 00:19:39.152 "block_size": 512, 00:19:39.152 "num_blocks": 65536, 00:19:39.152 "uuid": "fdbc3b23-3170-47e1-80a5-a81812bcb44b", 00:19:39.152 "assigned_rate_limits": { 00:19:39.152 "rw_ios_per_sec": 0, 00:19:39.152 "rw_mbytes_per_sec": 0, 00:19:39.152 "r_mbytes_per_sec": 0, 00:19:39.152 "w_mbytes_per_sec": 0 00:19:39.152 }, 00:19:39.152 "claimed": true, 00:19:39.152 "claim_type": "exclusive_write", 00:19:39.152 "zoned": false, 00:19:39.152 "supported_io_types": { 00:19:39.152 "read": true, 00:19:39.152 "write": true, 00:19:39.152 "unmap": true, 00:19:39.152 "flush": true, 00:19:39.152 "reset": true, 00:19:39.152 "nvme_admin": false, 00:19:39.152 "nvme_io": false, 00:19:39.152 "nvme_io_md": false, 00:19:39.152 "write_zeroes": true, 00:19:39.152 "zcopy": true, 00:19:39.152 "get_zone_info": 
false, 00:19:39.152 "zone_management": false, 00:19:39.152 "zone_append": false, 00:19:39.152 "compare": false, 00:19:39.152 "compare_and_write": false, 00:19:39.152 "abort": true, 00:19:39.152 "seek_hole": false, 00:19:39.152 "seek_data": false, 00:19:39.152 "copy": true, 00:19:39.152 "nvme_iov_md": false 00:19:39.152 }, 00:19:39.152 "memory_domains": [ 00:19:39.152 { 00:19:39.152 "dma_device_id": "system", 00:19:39.152 "dma_device_type": 1 00:19:39.152 }, 00:19:39.152 { 00:19:39.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:39.152 "dma_device_type": 2 00:19:39.152 } 00:19:39.152 ], 00:19:39.152 "driver_specific": {} 00:19:39.152 } 00:19:39.152 ] 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.152 13:28:19 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:39.152 13:28:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.412 13:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.412 "name": "Existed_Raid", 00:19:39.412 "uuid": "4f99d319-6a71-43cf-943d-5b8259e13ce0", 00:19:39.412 "strip_size_kb": 64, 00:19:39.412 "state": "configuring", 00:19:39.412 "raid_level": "concat", 00:19:39.412 "superblock": true, 00:19:39.412 "num_base_bdevs": 4, 00:19:39.412 "num_base_bdevs_discovered": 3, 00:19:39.412 "num_base_bdevs_operational": 4, 00:19:39.412 "base_bdevs_list": [ 00:19:39.412 { 00:19:39.412 "name": "BaseBdev1", 00:19:39.412 "uuid": "12dc7230-7499-4130-8aa9-aed035affe05", 00:19:39.412 "is_configured": true, 00:19:39.412 "data_offset": 2048, 00:19:39.412 "data_size": 63488 00:19:39.412 }, 00:19:39.412 { 00:19:39.412 "name": "BaseBdev2", 00:19:39.412 "uuid": "40303f8e-6c34-4f7e-89e6-87d6deb3f7f2", 00:19:39.412 "is_configured": true, 00:19:39.412 "data_offset": 2048, 00:19:39.412 "data_size": 63488 00:19:39.412 }, 00:19:39.412 { 00:19:39.412 "name": "BaseBdev3", 00:19:39.412 "uuid": "fdbc3b23-3170-47e1-80a5-a81812bcb44b", 00:19:39.412 "is_configured": true, 00:19:39.412 "data_offset": 2048, 00:19:39.412 "data_size": 63488 00:19:39.412 }, 00:19:39.412 { 00:19:39.412 "name": "BaseBdev4", 00:19:39.412 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.412 "is_configured": false, 00:19:39.412 "data_offset": 0, 00:19:39.412 "data_size": 0 00:19:39.412 } 00:19:39.412 ] 00:19:39.412 }' 00:19:39.412 
13:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.412 13:28:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:39.981 13:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:39.981 [2024-07-25 13:28:20.740530] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:39.981 [2024-07-25 13:28:20.740663] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2482fd0 00:19:39.981 [2024-07-25 13:28:20.740672] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:39.981 [2024-07-25 13:28:20.740808] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26278e0 00:19:39.981 [2024-07-25 13:28:20.740902] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2482fd0 00:19:39.981 [2024-07-25 13:28:20.740907] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2482fd0 00:19:39.981 [2024-07-25 13:28:20.740975] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:39.981 BaseBdev4 00:19:39.981 13:28:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:39.981 13:28:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:19:39.981 13:28:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:39.981 13:28:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:39.981 13:28:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:39.981 13:28:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # 
bdev_timeout=2000 00:19:39.981 13:28:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:40.241 13:28:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:40.502 [ 00:19:40.502 { 00:19:40.502 "name": "BaseBdev4", 00:19:40.502 "aliases": [ 00:19:40.502 "c8fa1fcb-0a80-422b-b5b7-95a6dbe8c7c3" 00:19:40.502 ], 00:19:40.502 "product_name": "Malloc disk", 00:19:40.502 "block_size": 512, 00:19:40.502 "num_blocks": 65536, 00:19:40.502 "uuid": "c8fa1fcb-0a80-422b-b5b7-95a6dbe8c7c3", 00:19:40.502 "assigned_rate_limits": { 00:19:40.502 "rw_ios_per_sec": 0, 00:19:40.502 "rw_mbytes_per_sec": 0, 00:19:40.502 "r_mbytes_per_sec": 0, 00:19:40.502 "w_mbytes_per_sec": 0 00:19:40.502 }, 00:19:40.502 "claimed": true, 00:19:40.502 "claim_type": "exclusive_write", 00:19:40.502 "zoned": false, 00:19:40.502 "supported_io_types": { 00:19:40.502 "read": true, 00:19:40.502 "write": true, 00:19:40.502 "unmap": true, 00:19:40.502 "flush": true, 00:19:40.502 "reset": true, 00:19:40.502 "nvme_admin": false, 00:19:40.502 "nvme_io": false, 00:19:40.502 "nvme_io_md": false, 00:19:40.502 "write_zeroes": true, 00:19:40.502 "zcopy": true, 00:19:40.502 "get_zone_info": false, 00:19:40.502 "zone_management": false, 00:19:40.502 "zone_append": false, 00:19:40.502 "compare": false, 00:19:40.502 "compare_and_write": false, 00:19:40.502 "abort": true, 00:19:40.502 "seek_hole": false, 00:19:40.502 "seek_data": false, 00:19:40.502 "copy": true, 00:19:40.502 "nvme_iov_md": false 00:19:40.502 }, 00:19:40.502 "memory_domains": [ 00:19:40.502 { 00:19:40.503 "dma_device_id": "system", 00:19:40.503 "dma_device_type": 1 00:19:40.503 }, 00:19:40.503 { 00:19:40.503 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:40.503 
"dma_device_type": 2 00:19:40.503 } 00:19:40.503 ], 00:19:40.503 "driver_specific": {} 00:19:40.503 } 00:19:40.503 ] 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:40.503 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:40.763 13:28:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:40.763 "name": "Existed_Raid", 00:19:40.763 "uuid": "4f99d319-6a71-43cf-943d-5b8259e13ce0", 00:19:40.763 "strip_size_kb": 64, 00:19:40.763 "state": "online", 00:19:40.763 "raid_level": "concat", 00:19:40.763 "superblock": true, 00:19:40.763 "num_base_bdevs": 4, 00:19:40.763 "num_base_bdevs_discovered": 4, 00:19:40.763 "num_base_bdevs_operational": 4, 00:19:40.763 "base_bdevs_list": [ 00:19:40.763 { 00:19:40.763 "name": "BaseBdev1", 00:19:40.763 "uuid": "12dc7230-7499-4130-8aa9-aed035affe05", 00:19:40.763 "is_configured": true, 00:19:40.763 "data_offset": 2048, 00:19:40.763 "data_size": 63488 00:19:40.763 }, 00:19:40.763 { 00:19:40.763 "name": "BaseBdev2", 00:19:40.763 "uuid": "40303f8e-6c34-4f7e-89e6-87d6deb3f7f2", 00:19:40.763 "is_configured": true, 00:19:40.763 "data_offset": 2048, 00:19:40.763 "data_size": 63488 00:19:40.763 }, 00:19:40.763 { 00:19:40.763 "name": "BaseBdev3", 00:19:40.763 "uuid": "fdbc3b23-3170-47e1-80a5-a81812bcb44b", 00:19:40.763 "is_configured": true, 00:19:40.763 "data_offset": 2048, 00:19:40.763 "data_size": 63488 00:19:40.763 }, 00:19:40.763 { 00:19:40.763 "name": "BaseBdev4", 00:19:40.763 "uuid": "c8fa1fcb-0a80-422b-b5b7-95a6dbe8c7c3", 00:19:40.763 "is_configured": true, 00:19:40.763 "data_offset": 2048, 00:19:40.763 "data_size": 63488 00:19:40.763 } 00:19:40.763 ] 00:19:40.763 }' 00:19:40.763 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:40.763 13:28:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:41.333 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:41.333 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:41.333 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:19:41.333 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:41.333 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:41.333 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:41.333 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:41.333 13:28:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:41.333 [2024-07-25 13:28:22.004004] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:41.333 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:41.333 "name": "Existed_Raid", 00:19:41.333 "aliases": [ 00:19:41.333 "4f99d319-6a71-43cf-943d-5b8259e13ce0" 00:19:41.333 ], 00:19:41.333 "product_name": "Raid Volume", 00:19:41.333 "block_size": 512, 00:19:41.333 "num_blocks": 253952, 00:19:41.333 "uuid": "4f99d319-6a71-43cf-943d-5b8259e13ce0", 00:19:41.333 "assigned_rate_limits": { 00:19:41.333 "rw_ios_per_sec": 0, 00:19:41.333 "rw_mbytes_per_sec": 0, 00:19:41.333 "r_mbytes_per_sec": 0, 00:19:41.333 "w_mbytes_per_sec": 0 00:19:41.333 }, 00:19:41.333 "claimed": false, 00:19:41.333 "zoned": false, 00:19:41.333 "supported_io_types": { 00:19:41.333 "read": true, 00:19:41.333 "write": true, 00:19:41.333 "unmap": true, 00:19:41.333 "flush": true, 00:19:41.333 "reset": true, 00:19:41.333 "nvme_admin": false, 00:19:41.333 "nvme_io": false, 00:19:41.333 "nvme_io_md": false, 00:19:41.333 "write_zeroes": true, 00:19:41.333 "zcopy": false, 00:19:41.333 "get_zone_info": false, 00:19:41.333 "zone_management": false, 00:19:41.333 "zone_append": false, 00:19:41.333 "compare": false, 00:19:41.333 "compare_and_write": false, 00:19:41.333 "abort": false, 00:19:41.333 "seek_hole": 
false, 00:19:41.333 "seek_data": false, 00:19:41.333 "copy": false, 00:19:41.333 "nvme_iov_md": false 00:19:41.333 }, 00:19:41.333 "memory_domains": [ 00:19:41.333 { 00:19:41.333 "dma_device_id": "system", 00:19:41.333 "dma_device_type": 1 00:19:41.333 }, 00:19:41.333 { 00:19:41.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.333 "dma_device_type": 2 00:19:41.333 }, 00:19:41.333 { 00:19:41.333 "dma_device_id": "system", 00:19:41.333 "dma_device_type": 1 00:19:41.333 }, 00:19:41.333 { 00:19:41.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.333 "dma_device_type": 2 00:19:41.333 }, 00:19:41.333 { 00:19:41.333 "dma_device_id": "system", 00:19:41.333 "dma_device_type": 1 00:19:41.333 }, 00:19:41.333 { 00:19:41.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.333 "dma_device_type": 2 00:19:41.333 }, 00:19:41.333 { 00:19:41.333 "dma_device_id": "system", 00:19:41.333 "dma_device_type": 1 00:19:41.333 }, 00:19:41.333 { 00:19:41.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.333 "dma_device_type": 2 00:19:41.333 } 00:19:41.333 ], 00:19:41.333 "driver_specific": { 00:19:41.333 "raid": { 00:19:41.333 "uuid": "4f99d319-6a71-43cf-943d-5b8259e13ce0", 00:19:41.333 "strip_size_kb": 64, 00:19:41.333 "state": "online", 00:19:41.333 "raid_level": "concat", 00:19:41.333 "superblock": true, 00:19:41.333 "num_base_bdevs": 4, 00:19:41.333 "num_base_bdevs_discovered": 4, 00:19:41.333 "num_base_bdevs_operational": 4, 00:19:41.333 "base_bdevs_list": [ 00:19:41.333 { 00:19:41.333 "name": "BaseBdev1", 00:19:41.333 "uuid": "12dc7230-7499-4130-8aa9-aed035affe05", 00:19:41.333 "is_configured": true, 00:19:41.333 "data_offset": 2048, 00:19:41.333 "data_size": 63488 00:19:41.333 }, 00:19:41.333 { 00:19:41.333 "name": "BaseBdev2", 00:19:41.333 "uuid": "40303f8e-6c34-4f7e-89e6-87d6deb3f7f2", 00:19:41.333 "is_configured": true, 00:19:41.333 "data_offset": 2048, 00:19:41.333 "data_size": 63488 00:19:41.333 }, 00:19:41.333 { 00:19:41.333 "name": "BaseBdev3", 00:19:41.333 
"uuid": "fdbc3b23-3170-47e1-80a5-a81812bcb44b", 00:19:41.333 "is_configured": true, 00:19:41.333 "data_offset": 2048, 00:19:41.333 "data_size": 63488 00:19:41.333 }, 00:19:41.333 { 00:19:41.333 "name": "BaseBdev4", 00:19:41.333 "uuid": "c8fa1fcb-0a80-422b-b5b7-95a6dbe8c7c3", 00:19:41.333 "is_configured": true, 00:19:41.333 "data_offset": 2048, 00:19:41.333 "data_size": 63488 00:19:41.333 } 00:19:41.333 ] 00:19:41.333 } 00:19:41.333 } 00:19:41.333 }' 00:19:41.333 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:41.333 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:41.333 BaseBdev2 00:19:41.333 BaseBdev3 00:19:41.333 BaseBdev4' 00:19:41.333 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:41.333 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:41.333 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:41.593 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:41.593 "name": "BaseBdev1", 00:19:41.593 "aliases": [ 00:19:41.593 "12dc7230-7499-4130-8aa9-aed035affe05" 00:19:41.593 ], 00:19:41.593 "product_name": "Malloc disk", 00:19:41.593 "block_size": 512, 00:19:41.593 "num_blocks": 65536, 00:19:41.593 "uuid": "12dc7230-7499-4130-8aa9-aed035affe05", 00:19:41.593 "assigned_rate_limits": { 00:19:41.593 "rw_ios_per_sec": 0, 00:19:41.593 "rw_mbytes_per_sec": 0, 00:19:41.593 "r_mbytes_per_sec": 0, 00:19:41.593 "w_mbytes_per_sec": 0 00:19:41.593 }, 00:19:41.594 "claimed": true, 00:19:41.594 "claim_type": "exclusive_write", 00:19:41.594 "zoned": false, 00:19:41.594 "supported_io_types": { 
00:19:41.594 "read": true, 00:19:41.594 "write": true, 00:19:41.594 "unmap": true, 00:19:41.594 "flush": true, 00:19:41.594 "reset": true, 00:19:41.594 "nvme_admin": false, 00:19:41.594 "nvme_io": false, 00:19:41.594 "nvme_io_md": false, 00:19:41.594 "write_zeroes": true, 00:19:41.594 "zcopy": true, 00:19:41.594 "get_zone_info": false, 00:19:41.594 "zone_management": false, 00:19:41.594 "zone_append": false, 00:19:41.594 "compare": false, 00:19:41.594 "compare_and_write": false, 00:19:41.594 "abort": true, 00:19:41.594 "seek_hole": false, 00:19:41.594 "seek_data": false, 00:19:41.594 "copy": true, 00:19:41.594 "nvme_iov_md": false 00:19:41.594 }, 00:19:41.594 "memory_domains": [ 00:19:41.594 { 00:19:41.594 "dma_device_id": "system", 00:19:41.594 "dma_device_type": 1 00:19:41.594 }, 00:19:41.594 { 00:19:41.594 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.594 "dma_device_type": 2 00:19:41.594 } 00:19:41.594 ], 00:19:41.594 "driver_specific": {} 00:19:41.594 }' 00:19:41.594 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.594 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:41.594 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:41.594 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.594 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:41.854 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:41.854 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.854 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:41.854 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:41.854 13:28:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.854 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:41.854 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:41.854 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:41.854 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:41.854 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.115 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.115 "name": "BaseBdev2", 00:19:42.115 "aliases": [ 00:19:42.115 "40303f8e-6c34-4f7e-89e6-87d6deb3f7f2" 00:19:42.115 ], 00:19:42.115 "product_name": "Malloc disk", 00:19:42.115 "block_size": 512, 00:19:42.115 "num_blocks": 65536, 00:19:42.115 "uuid": "40303f8e-6c34-4f7e-89e6-87d6deb3f7f2", 00:19:42.115 "assigned_rate_limits": { 00:19:42.115 "rw_ios_per_sec": 0, 00:19:42.115 "rw_mbytes_per_sec": 0, 00:19:42.115 "r_mbytes_per_sec": 0, 00:19:42.115 "w_mbytes_per_sec": 0 00:19:42.115 }, 00:19:42.115 "claimed": true, 00:19:42.115 "claim_type": "exclusive_write", 00:19:42.115 "zoned": false, 00:19:42.115 "supported_io_types": { 00:19:42.115 "read": true, 00:19:42.115 "write": true, 00:19:42.115 "unmap": true, 00:19:42.115 "flush": true, 00:19:42.115 "reset": true, 00:19:42.115 "nvme_admin": false, 00:19:42.115 "nvme_io": false, 00:19:42.115 "nvme_io_md": false, 00:19:42.115 "write_zeroes": true, 00:19:42.115 "zcopy": true, 00:19:42.115 "get_zone_info": false, 00:19:42.115 "zone_management": false, 00:19:42.115 "zone_append": false, 00:19:42.115 "compare": false, 00:19:42.115 "compare_and_write": false, 00:19:42.115 "abort": true, 00:19:42.115 "seek_hole": false, 00:19:42.115 "seek_data": 
false, 00:19:42.115 "copy": true, 00:19:42.115 "nvme_iov_md": false 00:19:42.115 }, 00:19:42.115 "memory_domains": [ 00:19:42.115 { 00:19:42.115 "dma_device_id": "system", 00:19:42.115 "dma_device_type": 1 00:19:42.115 }, 00:19:42.115 { 00:19:42.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.115 "dma_device_type": 2 00:19:42.115 } 00:19:42.115 ], 00:19:42.115 "driver_specific": {} 00:19:42.115 }' 00:19:42.115 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.115 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.115 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:42.115 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.376 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.376 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:42.376 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.376 13:28:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.376 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:42.376 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.376 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.376 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.376 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.376 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:42.376 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:42.636 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:42.636 "name": "BaseBdev3", 00:19:42.636 "aliases": [ 00:19:42.636 "fdbc3b23-3170-47e1-80a5-a81812bcb44b" 00:19:42.636 ], 00:19:42.636 "product_name": "Malloc disk", 00:19:42.636 "block_size": 512, 00:19:42.636 "num_blocks": 65536, 00:19:42.636 "uuid": "fdbc3b23-3170-47e1-80a5-a81812bcb44b", 00:19:42.636 "assigned_rate_limits": { 00:19:42.636 "rw_ios_per_sec": 0, 00:19:42.636 "rw_mbytes_per_sec": 0, 00:19:42.636 "r_mbytes_per_sec": 0, 00:19:42.636 "w_mbytes_per_sec": 0 00:19:42.636 }, 00:19:42.636 "claimed": true, 00:19:42.636 "claim_type": "exclusive_write", 00:19:42.636 "zoned": false, 00:19:42.636 "supported_io_types": { 00:19:42.636 "read": true, 00:19:42.636 "write": true, 00:19:42.636 "unmap": true, 00:19:42.636 "flush": true, 00:19:42.636 "reset": true, 00:19:42.636 "nvme_admin": false, 00:19:42.636 "nvme_io": false, 00:19:42.636 "nvme_io_md": false, 00:19:42.636 "write_zeroes": true, 00:19:42.636 "zcopy": true, 00:19:42.636 "get_zone_info": false, 00:19:42.636 "zone_management": false, 00:19:42.636 "zone_append": false, 00:19:42.636 "compare": false, 00:19:42.636 "compare_and_write": false, 00:19:42.636 "abort": true, 00:19:42.636 "seek_hole": false, 00:19:42.636 "seek_data": false, 00:19:42.636 "copy": true, 00:19:42.636 "nvme_iov_md": false 00:19:42.636 }, 00:19:42.636 "memory_domains": [ 00:19:42.636 { 00:19:42.636 "dma_device_id": "system", 00:19:42.636 "dma_device_type": 1 00:19:42.636 }, 00:19:42.636 { 00:19:42.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:42.636 "dma_device_type": 2 00:19:42.636 } 00:19:42.636 ], 00:19:42.636 "driver_specific": {} 00:19:42.636 }' 00:19:42.636 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.636 13:28:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:42.636 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:42.636 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.636 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:42.897 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:42.897 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.897 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:42.897 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:42.897 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.897 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:42.897 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:42.897 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:42.897 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:42.897 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:43.157 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:43.157 "name": "BaseBdev4", 00:19:43.157 "aliases": [ 00:19:43.157 "c8fa1fcb-0a80-422b-b5b7-95a6dbe8c7c3" 00:19:43.157 ], 00:19:43.157 "product_name": "Malloc disk", 00:19:43.157 "block_size": 512, 00:19:43.157 "num_blocks": 65536, 00:19:43.157 "uuid": "c8fa1fcb-0a80-422b-b5b7-95a6dbe8c7c3", 00:19:43.157 
"assigned_rate_limits": { 00:19:43.157 "rw_ios_per_sec": 0, 00:19:43.157 "rw_mbytes_per_sec": 0, 00:19:43.157 "r_mbytes_per_sec": 0, 00:19:43.157 "w_mbytes_per_sec": 0 00:19:43.157 }, 00:19:43.157 "claimed": true, 00:19:43.157 "claim_type": "exclusive_write", 00:19:43.157 "zoned": false, 00:19:43.157 "supported_io_types": { 00:19:43.157 "read": true, 00:19:43.157 "write": true, 00:19:43.157 "unmap": true, 00:19:43.157 "flush": true, 00:19:43.157 "reset": true, 00:19:43.157 "nvme_admin": false, 00:19:43.157 "nvme_io": false, 00:19:43.157 "nvme_io_md": false, 00:19:43.157 "write_zeroes": true, 00:19:43.157 "zcopy": true, 00:19:43.157 "get_zone_info": false, 00:19:43.157 "zone_management": false, 00:19:43.157 "zone_append": false, 00:19:43.157 "compare": false, 00:19:43.157 "compare_and_write": false, 00:19:43.157 "abort": true, 00:19:43.157 "seek_hole": false, 00:19:43.157 "seek_data": false, 00:19:43.157 "copy": true, 00:19:43.157 "nvme_iov_md": false 00:19:43.157 }, 00:19:43.157 "memory_domains": [ 00:19:43.157 { 00:19:43.157 "dma_device_id": "system", 00:19:43.157 "dma_device_type": 1 00:19:43.157 }, 00:19:43.157 { 00:19:43.157 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:43.157 "dma_device_type": 2 00:19:43.157 } 00:19:43.157 ], 00:19:43.157 "driver_specific": {} 00:19:43.157 }' 00:19:43.157 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.157 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:43.157 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:43.157 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.418 13:28:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:43.418 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:43.418 13:28:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.418 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:43.418 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:43.418 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.418 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:43.418 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:43.418 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:43.679 [2024-07-25 13:28:24.365762] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:43.679 [2024-07-25 13:28:24.365786] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:43.679 [2024-07-25 13:28:24.365834] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:43.679 13:28:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:43.679 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:43.940 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:43.940 "name": "Existed_Raid", 00:19:43.940 "uuid": "4f99d319-6a71-43cf-943d-5b8259e13ce0", 00:19:43.940 "strip_size_kb": 64, 00:19:43.940 "state": "offline", 00:19:43.940 "raid_level": "concat", 00:19:43.940 "superblock": true, 00:19:43.940 "num_base_bdevs": 4, 00:19:43.940 "num_base_bdevs_discovered": 3, 00:19:43.940 "num_base_bdevs_operational": 3, 00:19:43.940 "base_bdevs_list": [ 00:19:43.940 { 00:19:43.940 "name": null, 00:19:43.940 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.940 "is_configured": false, 00:19:43.940 "data_offset": 2048, 00:19:43.940 "data_size": 63488 00:19:43.940 }, 00:19:43.940 { 00:19:43.940 "name": 
"BaseBdev2", 00:19:43.940 "uuid": "40303f8e-6c34-4f7e-89e6-87d6deb3f7f2", 00:19:43.940 "is_configured": true, 00:19:43.940 "data_offset": 2048, 00:19:43.940 "data_size": 63488 00:19:43.940 }, 00:19:43.940 { 00:19:43.940 "name": "BaseBdev3", 00:19:43.940 "uuid": "fdbc3b23-3170-47e1-80a5-a81812bcb44b", 00:19:43.940 "is_configured": true, 00:19:43.940 "data_offset": 2048, 00:19:43.940 "data_size": 63488 00:19:43.940 }, 00:19:43.940 { 00:19:43.940 "name": "BaseBdev4", 00:19:43.940 "uuid": "c8fa1fcb-0a80-422b-b5b7-95a6dbe8c7c3", 00:19:43.940 "is_configured": true, 00:19:43.940 "data_offset": 2048, 00:19:43.940 "data_size": 63488 00:19:43.940 } 00:19:43.940 ] 00:19:43.940 }' 00:19:43.940 13:28:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:43.940 13:28:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:44.510 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:44.510 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:44.510 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.510 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:44.770 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:44.770 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:44.770 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:44.770 [2024-07-25 13:28:25.516667] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:44.770 13:28:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:44.770 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:44.770 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.770 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:45.029 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:45.029 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:45.029 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:45.289 [2024-07-25 13:28:25.907543] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:45.289 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:45.289 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:45.289 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.289 13:28:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:45.550 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:45.550 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:45.550 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:45.550 [2024-07-25 13:28:26.294307] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:45.550 [2024-07-25 13:28:26.294335] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2482fd0 name Existed_Raid, state offline 00:19:45.550 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:45.550 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:45.550 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:45.550 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:45.816 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:45.816 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:45.816 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:45.816 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:45.816 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:45.816 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:46.076 BaseBdev2 00:19:46.076 13:28:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:46.076 13:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:19:46.076 13:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:46.076 13:28:26 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:46.076 13:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:46.076 13:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:46.076 13:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:46.337 13:28:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:46.337 [ 00:19:46.337 { 00:19:46.337 "name": "BaseBdev2", 00:19:46.337 "aliases": [ 00:19:46.337 "2862607f-e67d-48cd-96c2-61a913c21ea7" 00:19:46.337 ], 00:19:46.337 "product_name": "Malloc disk", 00:19:46.337 "block_size": 512, 00:19:46.337 "num_blocks": 65536, 00:19:46.337 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:46.337 "assigned_rate_limits": { 00:19:46.337 "rw_ios_per_sec": 0, 00:19:46.337 "rw_mbytes_per_sec": 0, 00:19:46.337 "r_mbytes_per_sec": 0, 00:19:46.337 "w_mbytes_per_sec": 0 00:19:46.337 }, 00:19:46.337 "claimed": false, 00:19:46.337 "zoned": false, 00:19:46.337 "supported_io_types": { 00:19:46.337 "read": true, 00:19:46.337 "write": true, 00:19:46.337 "unmap": true, 00:19:46.337 "flush": true, 00:19:46.337 "reset": true, 00:19:46.337 "nvme_admin": false, 00:19:46.337 "nvme_io": false, 00:19:46.337 "nvme_io_md": false, 00:19:46.337 "write_zeroes": true, 00:19:46.337 "zcopy": true, 00:19:46.337 "get_zone_info": false, 00:19:46.337 "zone_management": false, 00:19:46.337 "zone_append": false, 00:19:46.337 "compare": false, 00:19:46.337 "compare_and_write": false, 00:19:46.337 "abort": true, 00:19:46.337 "seek_hole": false, 00:19:46.337 "seek_data": false, 00:19:46.337 "copy": true, 00:19:46.337 "nvme_iov_md": 
false 00:19:46.337 }, 00:19:46.337 "memory_domains": [ 00:19:46.337 { 00:19:46.337 "dma_device_id": "system", 00:19:46.337 "dma_device_type": 1 00:19:46.337 }, 00:19:46.337 { 00:19:46.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.337 "dma_device_type": 2 00:19:46.337 } 00:19:46.337 ], 00:19:46.337 "driver_specific": {} 00:19:46.337 } 00:19:46.337 ] 00:19:46.337 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:46.337 13:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:46.337 13:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:46.337 13:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:46.597 BaseBdev3 00:19:46.597 13:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:46.597 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:19:46.597 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:46.597 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:46.597 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:46.597 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:46.597 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:46.856 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:46.856 [ 00:19:46.856 { 00:19:46.856 "name": "BaseBdev3", 00:19:46.856 "aliases": [ 00:19:46.856 "5a887f2a-0141-4dfe-a42f-3ebfb43826c1" 00:19:46.856 ], 00:19:46.856 "product_name": "Malloc disk", 00:19:46.856 "block_size": 512, 00:19:46.856 "num_blocks": 65536, 00:19:46.856 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:46.856 "assigned_rate_limits": { 00:19:46.856 "rw_ios_per_sec": 0, 00:19:46.856 "rw_mbytes_per_sec": 0, 00:19:46.856 "r_mbytes_per_sec": 0, 00:19:46.856 "w_mbytes_per_sec": 0 00:19:46.856 }, 00:19:46.856 "claimed": false, 00:19:46.856 "zoned": false, 00:19:46.856 "supported_io_types": { 00:19:46.856 "read": true, 00:19:46.856 "write": true, 00:19:46.856 "unmap": true, 00:19:46.856 "flush": true, 00:19:46.856 "reset": true, 00:19:46.856 "nvme_admin": false, 00:19:46.856 "nvme_io": false, 00:19:46.856 "nvme_io_md": false, 00:19:46.856 "write_zeroes": true, 00:19:46.856 "zcopy": true, 00:19:46.856 "get_zone_info": false, 00:19:46.856 "zone_management": false, 00:19:46.856 "zone_append": false, 00:19:46.856 "compare": false, 00:19:46.856 "compare_and_write": false, 00:19:46.856 "abort": true, 00:19:46.856 "seek_hole": false, 00:19:46.856 "seek_data": false, 00:19:46.856 "copy": true, 00:19:46.856 "nvme_iov_md": false 00:19:46.856 }, 00:19:46.856 "memory_domains": [ 00:19:46.856 { 00:19:46.856 "dma_device_id": "system", 00:19:46.856 "dma_device_type": 1 00:19:46.856 }, 00:19:46.856 { 00:19:46.856 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.856 "dma_device_type": 2 00:19:46.856 } 00:19:46.856 ], 00:19:46.856 "driver_specific": {} 00:19:46.856 } 00:19:46.856 ] 00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:47.116 BaseBdev4 00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:47.116 13:28:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:47.376 13:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:47.636 [ 00:19:47.636 { 00:19:47.636 "name": "BaseBdev4", 00:19:47.636 "aliases": [ 00:19:47.636 "7f32ef83-9af8-4294-8238-aa2008de7302" 00:19:47.636 ], 00:19:47.636 "product_name": "Malloc disk", 00:19:47.636 "block_size": 512, 00:19:47.636 "num_blocks": 65536, 00:19:47.636 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:47.636 "assigned_rate_limits": { 00:19:47.636 "rw_ios_per_sec": 0, 00:19:47.636 "rw_mbytes_per_sec": 0, 00:19:47.636 "r_mbytes_per_sec": 0, 00:19:47.636 "w_mbytes_per_sec": 0 00:19:47.636 }, 00:19:47.636 "claimed": false, 00:19:47.636 "zoned": false, 00:19:47.636 "supported_io_types": { 00:19:47.636 
"read": true, 00:19:47.636 "write": true, 00:19:47.636 "unmap": true, 00:19:47.636 "flush": true, 00:19:47.636 "reset": true, 00:19:47.636 "nvme_admin": false, 00:19:47.636 "nvme_io": false, 00:19:47.636 "nvme_io_md": false, 00:19:47.636 "write_zeroes": true, 00:19:47.636 "zcopy": true, 00:19:47.636 "get_zone_info": false, 00:19:47.636 "zone_management": false, 00:19:47.636 "zone_append": false, 00:19:47.636 "compare": false, 00:19:47.636 "compare_and_write": false, 00:19:47.636 "abort": true, 00:19:47.636 "seek_hole": false, 00:19:47.636 "seek_data": false, 00:19:47.636 "copy": true, 00:19:47.636 "nvme_iov_md": false 00:19:47.636 }, 00:19:47.636 "memory_domains": [ 00:19:47.636 { 00:19:47.636 "dma_device_id": "system", 00:19:47.636 "dma_device_type": 1 00:19:47.636 }, 00:19:47.636 { 00:19:47.636 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.636 "dma_device_type": 2 00:19:47.636 } 00:19:47.636 ], 00:19:47.636 "driver_specific": {} 00:19:47.636 } 00:19:47.636 ] 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:47.636 [2024-07-25 13:28:28.397412] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:47.636 [2024-07-25 13:28:28.397440] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:47.636 [2024-07-25 13:28:28.397452] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:47.636 [2024-07-25 
13:28:28.398497] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:47.636 [2024-07-25 13:28:28.398529] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.636 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:47.896 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:47.896 "name": "Existed_Raid", 00:19:47.896 "uuid": "0690ba4a-d689-4c2e-8f20-55ce721507f4", 00:19:47.896 "strip_size_kb": 64, 
00:19:47.896 "state": "configuring", 00:19:47.896 "raid_level": "concat", 00:19:47.896 "superblock": true, 00:19:47.896 "num_base_bdevs": 4, 00:19:47.896 "num_base_bdevs_discovered": 3, 00:19:47.896 "num_base_bdevs_operational": 4, 00:19:47.896 "base_bdevs_list": [ 00:19:47.896 { 00:19:47.896 "name": "BaseBdev1", 00:19:47.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:47.896 "is_configured": false, 00:19:47.896 "data_offset": 0, 00:19:47.896 "data_size": 0 00:19:47.896 }, 00:19:47.896 { 00:19:47.896 "name": "BaseBdev2", 00:19:47.896 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:47.896 "is_configured": true, 00:19:47.896 "data_offset": 2048, 00:19:47.896 "data_size": 63488 00:19:47.896 }, 00:19:47.896 { 00:19:47.896 "name": "BaseBdev3", 00:19:47.896 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:47.896 "is_configured": true, 00:19:47.896 "data_offset": 2048, 00:19:47.896 "data_size": 63488 00:19:47.896 }, 00:19:47.896 { 00:19:47.896 "name": "BaseBdev4", 00:19:47.896 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:47.896 "is_configured": true, 00:19:47.896 "data_offset": 2048, 00:19:47.896 "data_size": 63488 00:19:47.896 } 00:19:47.896 ] 00:19:47.896 }' 00:19:47.896 13:28:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:47.896 13:28:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:48.466 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:48.727 [2024-07-25 13:28:29.287636] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:48.727 "name": "Existed_Raid", 00:19:48.727 "uuid": "0690ba4a-d689-4c2e-8f20-55ce721507f4", 00:19:48.727 "strip_size_kb": 64, 00:19:48.727 "state": "configuring", 00:19:48.727 "raid_level": "concat", 00:19:48.727 "superblock": true, 00:19:48.727 "num_base_bdevs": 4, 00:19:48.727 "num_base_bdevs_discovered": 2, 00:19:48.727 "num_base_bdevs_operational": 4, 00:19:48.727 "base_bdevs_list": [ 00:19:48.727 { 00:19:48.727 "name": "BaseBdev1", 00:19:48.727 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:48.727 "is_configured": false, 00:19:48.727 "data_offset": 0, 00:19:48.727 "data_size": 0 
00:19:48.727 }, 00:19:48.727 { 00:19:48.727 "name": null, 00:19:48.727 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:48.727 "is_configured": false, 00:19:48.727 "data_offset": 2048, 00:19:48.727 "data_size": 63488 00:19:48.727 }, 00:19:48.727 { 00:19:48.727 "name": "BaseBdev3", 00:19:48.727 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:48.727 "is_configured": true, 00:19:48.727 "data_offset": 2048, 00:19:48.727 "data_size": 63488 00:19:48.727 }, 00:19:48.727 { 00:19:48.727 "name": "BaseBdev4", 00:19:48.727 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:48.727 "is_configured": true, 00:19:48.727 "data_offset": 2048, 00:19:48.727 "data_size": 63488 00:19:48.727 } 00:19:48.727 ] 00:19:48.727 }' 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:48.727 13:28:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:49.297 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:49.297 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:49.558 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:49.558 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:49.819 [2024-07-25 13:28:30.411462] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:49.819 BaseBdev1 00:19:49.819 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:49.819 13:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 
00:19:49.819 13:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:49.819 13:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:49.819 13:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:49.819 13:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:49.819 13:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:49.819 13:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:50.080 [ 00:19:50.080 { 00:19:50.080 "name": "BaseBdev1", 00:19:50.080 "aliases": [ 00:19:50.080 "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda" 00:19:50.080 ], 00:19:50.080 "product_name": "Malloc disk", 00:19:50.080 "block_size": 512, 00:19:50.080 "num_blocks": 65536, 00:19:50.080 "uuid": "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda", 00:19:50.080 "assigned_rate_limits": { 00:19:50.080 "rw_ios_per_sec": 0, 00:19:50.080 "rw_mbytes_per_sec": 0, 00:19:50.080 "r_mbytes_per_sec": 0, 00:19:50.080 "w_mbytes_per_sec": 0 00:19:50.080 }, 00:19:50.080 "claimed": true, 00:19:50.080 "claim_type": "exclusive_write", 00:19:50.080 "zoned": false, 00:19:50.080 "supported_io_types": { 00:19:50.080 "read": true, 00:19:50.080 "write": true, 00:19:50.080 "unmap": true, 00:19:50.080 "flush": true, 00:19:50.080 "reset": true, 00:19:50.080 "nvme_admin": false, 00:19:50.080 "nvme_io": false, 00:19:50.080 "nvme_io_md": false, 00:19:50.080 "write_zeroes": true, 00:19:50.080 "zcopy": true, 00:19:50.080 "get_zone_info": false, 00:19:50.080 "zone_management": false, 00:19:50.080 "zone_append": false, 00:19:50.080 "compare": false, 
00:19:50.080 "compare_and_write": false, 00:19:50.080 "abort": true, 00:19:50.080 "seek_hole": false, 00:19:50.080 "seek_data": false, 00:19:50.080 "copy": true, 00:19:50.080 "nvme_iov_md": false 00:19:50.080 }, 00:19:50.080 "memory_domains": [ 00:19:50.080 { 00:19:50.080 "dma_device_id": "system", 00:19:50.080 "dma_device_type": 1 00:19:50.080 }, 00:19:50.080 { 00:19:50.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.080 "dma_device_type": 2 00:19:50.080 } 00:19:50.080 ], 00:19:50.080 "driver_specific": {} 00:19:50.080 } 00:19:50.080 ] 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.080 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:50.339 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:50.339 "name": "Existed_Raid", 00:19:50.339 "uuid": "0690ba4a-d689-4c2e-8f20-55ce721507f4", 00:19:50.339 "strip_size_kb": 64, 00:19:50.339 "state": "configuring", 00:19:50.339 "raid_level": "concat", 00:19:50.339 "superblock": true, 00:19:50.339 "num_base_bdevs": 4, 00:19:50.339 "num_base_bdevs_discovered": 3, 00:19:50.339 "num_base_bdevs_operational": 4, 00:19:50.339 "base_bdevs_list": [ 00:19:50.339 { 00:19:50.339 "name": "BaseBdev1", 00:19:50.339 "uuid": "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda", 00:19:50.339 "is_configured": true, 00:19:50.339 "data_offset": 2048, 00:19:50.339 "data_size": 63488 00:19:50.339 }, 00:19:50.339 { 00:19:50.339 "name": null, 00:19:50.339 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:50.339 "is_configured": false, 00:19:50.339 "data_offset": 2048, 00:19:50.339 "data_size": 63488 00:19:50.339 }, 00:19:50.339 { 00:19:50.339 "name": "BaseBdev3", 00:19:50.339 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:50.339 "is_configured": true, 00:19:50.339 "data_offset": 2048, 00:19:50.339 "data_size": 63488 00:19:50.339 }, 00:19:50.339 { 00:19:50.339 "name": "BaseBdev4", 00:19:50.339 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:50.339 "is_configured": true, 00:19:50.339 "data_offset": 2048, 00:19:50.339 "data_size": 63488 00:19:50.339 } 00:19:50.339 ] 00:19:50.339 }' 00:19:50.339 13:28:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:50.339 13:28:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:50.908 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:50.908 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:50.908 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:19:50.908 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:19:51.167 [2024-07-25 13:28:31.875171] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:51.167 13:28:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:51.427 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:51.428 "name": "Existed_Raid", 00:19:51.428 "uuid": "0690ba4a-d689-4c2e-8f20-55ce721507f4", 00:19:51.428 "strip_size_kb": 64, 00:19:51.428 "state": "configuring", 00:19:51.428 "raid_level": "concat", 00:19:51.428 "superblock": true, 00:19:51.428 "num_base_bdevs": 4, 00:19:51.428 "num_base_bdevs_discovered": 2, 00:19:51.428 "num_base_bdevs_operational": 4, 00:19:51.428 "base_bdevs_list": [ 00:19:51.428 { 00:19:51.428 "name": "BaseBdev1", 00:19:51.428 "uuid": "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda", 00:19:51.428 "is_configured": true, 00:19:51.428 "data_offset": 2048, 00:19:51.428 "data_size": 63488 00:19:51.428 }, 00:19:51.428 { 00:19:51.428 "name": null, 00:19:51.428 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:51.428 "is_configured": false, 00:19:51.428 "data_offset": 2048, 00:19:51.428 "data_size": 63488 00:19:51.428 }, 00:19:51.428 { 00:19:51.428 "name": null, 00:19:51.428 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:51.428 "is_configured": false, 00:19:51.428 "data_offset": 2048, 00:19:51.428 "data_size": 63488 00:19:51.428 }, 00:19:51.428 { 00:19:51.428 "name": "BaseBdev4", 00:19:51.428 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:51.428 "is_configured": true, 00:19:51.428 "data_offset": 2048, 00:19:51.428 "data_size": 63488 00:19:51.428 } 00:19:51.428 ] 00:19:51.428 }' 00:19:51.428 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:51.428 13:28:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:51.997 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:51.997 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:19:52.258 [2024-07-25 13:28:32.981986] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:52.258 13:28:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:52.518 13:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:52.518 "name": "Existed_Raid", 00:19:52.518 "uuid": "0690ba4a-d689-4c2e-8f20-55ce721507f4", 00:19:52.518 "strip_size_kb": 64, 00:19:52.518 "state": "configuring", 00:19:52.518 "raid_level": "concat", 00:19:52.518 "superblock": true, 00:19:52.518 "num_base_bdevs": 4, 00:19:52.518 "num_base_bdevs_discovered": 3, 00:19:52.518 "num_base_bdevs_operational": 4, 00:19:52.518 "base_bdevs_list": [ 00:19:52.518 { 00:19:52.518 "name": "BaseBdev1", 00:19:52.518 "uuid": "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda", 00:19:52.518 "is_configured": true, 00:19:52.518 "data_offset": 2048, 00:19:52.518 "data_size": 63488 00:19:52.518 }, 00:19:52.518 { 00:19:52.518 "name": null, 00:19:52.518 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:52.518 "is_configured": false, 00:19:52.518 "data_offset": 2048, 00:19:52.518 "data_size": 63488 00:19:52.518 }, 00:19:52.518 { 00:19:52.518 "name": "BaseBdev3", 00:19:52.518 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:52.518 "is_configured": true, 00:19:52.518 "data_offset": 2048, 00:19:52.518 "data_size": 63488 00:19:52.518 }, 00:19:52.518 { 00:19:52.518 "name": "BaseBdev4", 00:19:52.518 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:52.518 "is_configured": true, 00:19:52.518 "data_offset": 2048, 00:19:52.518 "data_size": 63488 00:19:52.518 } 00:19:52.518 ] 00:19:52.518 }' 00:19:52.518 13:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:52.518 13:28:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:53.085 13:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:19:53.085 13:28:33 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.345 13:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:19:53.345 13:28:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:53.345 [2024-07-25 13:28:34.104839] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:53.345 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:53.606 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:53.606 "name": "Existed_Raid", 00:19:53.606 "uuid": "0690ba4a-d689-4c2e-8f20-55ce721507f4", 00:19:53.606 "strip_size_kb": 64, 00:19:53.606 "state": "configuring", 00:19:53.606 "raid_level": "concat", 00:19:53.606 "superblock": true, 00:19:53.606 "num_base_bdevs": 4, 00:19:53.606 "num_base_bdevs_discovered": 2, 00:19:53.606 "num_base_bdevs_operational": 4, 00:19:53.606 "base_bdevs_list": [ 00:19:53.606 { 00:19:53.606 "name": null, 00:19:53.606 "uuid": "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda", 00:19:53.606 "is_configured": false, 00:19:53.606 "data_offset": 2048, 00:19:53.606 "data_size": 63488 00:19:53.606 }, 00:19:53.606 { 00:19:53.606 "name": null, 00:19:53.606 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:53.606 "is_configured": false, 00:19:53.606 "data_offset": 2048, 00:19:53.606 "data_size": 63488 00:19:53.606 }, 00:19:53.606 { 00:19:53.606 "name": "BaseBdev3", 00:19:53.606 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:53.606 "is_configured": true, 00:19:53.606 "data_offset": 2048, 00:19:53.606 "data_size": 63488 00:19:53.606 }, 00:19:53.606 { 00:19:53.606 "name": "BaseBdev4", 00:19:53.606 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:53.606 "is_configured": true, 00:19:53.606 "data_offset": 2048, 00:19:53.606 "data_size": 63488 00:19:53.606 } 00:19:53.606 ] 00:19:53.606 }' 00:19:53.606 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:53.606 13:28:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:54.175 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:54.175 13:28:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:19:54.434 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:19:54.434 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:19:54.693 [2024-07-25 13:28:35.233519] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:54.693 "name": "Existed_Raid", 00:19:54.693 "uuid": "0690ba4a-d689-4c2e-8f20-55ce721507f4", 00:19:54.693 "strip_size_kb": 64, 00:19:54.693 "state": "configuring", 00:19:54.693 "raid_level": "concat", 00:19:54.693 "superblock": true, 00:19:54.693 "num_base_bdevs": 4, 00:19:54.693 "num_base_bdevs_discovered": 3, 00:19:54.693 "num_base_bdevs_operational": 4, 00:19:54.693 "base_bdevs_list": [ 00:19:54.693 { 00:19:54.693 "name": null, 00:19:54.693 "uuid": "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda", 00:19:54.693 "is_configured": false, 00:19:54.693 "data_offset": 2048, 00:19:54.693 "data_size": 63488 00:19:54.693 }, 00:19:54.693 { 00:19:54.693 "name": "BaseBdev2", 00:19:54.693 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:54.693 "is_configured": true, 00:19:54.693 "data_offset": 2048, 00:19:54.693 "data_size": 63488 00:19:54.693 }, 00:19:54.693 { 00:19:54.693 "name": "BaseBdev3", 00:19:54.693 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:54.693 "is_configured": true, 00:19:54.693 "data_offset": 2048, 00:19:54.693 "data_size": 63488 00:19:54.693 }, 00:19:54.693 { 00:19:54.693 "name": "BaseBdev4", 00:19:54.693 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:54.693 "is_configured": true, 00:19:54.693 "data_offset": 2048, 00:19:54.693 "data_size": 63488 00:19:54.693 } 00:19:54.693 ] 00:19:54.693 }' 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:54.693 13:28:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:55.269 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:19:55.269 13:28:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:55.558 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:19:55.558 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:55.558 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:55.831 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u f0cf197f-77a2-4d7f-9ae1-c3215efc2fda 00:19:55.831 [2024-07-25 13:28:36.553926] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:55.831 [2024-07-25 13:28:36.554041] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2482c70 00:19:55.831 [2024-07-25 13:28:36.554049] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:55.831 [2024-07-25 13:28:36.554184] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x24834f0 00:19:55.831 [2024-07-25 13:28:36.554273] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2482c70 00:19:55.831 [2024-07-25 13:28:36.554278] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2482c70 00:19:55.831 [2024-07-25 13:28:36.554346] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:55.831 NewBaseBdev 00:19:55.831 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:55.831 13:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local 
bdev_name=NewBaseBdev 00:19:55.831 13:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:19:55.831 13:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:19:55.831 13:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:19:55.831 13:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:19:55.831 13:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:56.091 13:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:56.351 [ 00:19:56.351 { 00:19:56.351 "name": "NewBaseBdev", 00:19:56.351 "aliases": [ 00:19:56.351 "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda" 00:19:56.351 ], 00:19:56.351 "product_name": "Malloc disk", 00:19:56.351 "block_size": 512, 00:19:56.351 "num_blocks": 65536, 00:19:56.351 "uuid": "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda", 00:19:56.351 "assigned_rate_limits": { 00:19:56.351 "rw_ios_per_sec": 0, 00:19:56.351 "rw_mbytes_per_sec": 0, 00:19:56.351 "r_mbytes_per_sec": 0, 00:19:56.351 "w_mbytes_per_sec": 0 00:19:56.351 }, 00:19:56.351 "claimed": true, 00:19:56.351 "claim_type": "exclusive_write", 00:19:56.351 "zoned": false, 00:19:56.351 "supported_io_types": { 00:19:56.351 "read": true, 00:19:56.351 "write": true, 00:19:56.351 "unmap": true, 00:19:56.351 "flush": true, 00:19:56.351 "reset": true, 00:19:56.351 "nvme_admin": false, 00:19:56.351 "nvme_io": false, 00:19:56.351 "nvme_io_md": false, 00:19:56.351 "write_zeroes": true, 00:19:56.351 "zcopy": true, 00:19:56.351 "get_zone_info": false, 00:19:56.351 "zone_management": false, 00:19:56.351 "zone_append": false, 
00:19:56.351 "compare": false, 00:19:56.351 "compare_and_write": false, 00:19:56.351 "abort": true, 00:19:56.351 "seek_hole": false, 00:19:56.351 "seek_data": false, 00:19:56.351 "copy": true, 00:19:56.351 "nvme_iov_md": false 00:19:56.351 }, 00:19:56.351 "memory_domains": [ 00:19:56.351 { 00:19:56.351 "dma_device_id": "system", 00:19:56.351 "dma_device_type": 1 00:19:56.351 }, 00:19:56.351 { 00:19:56.351 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.351 "dma_device_type": 2 00:19:56.351 } 00:19:56.351 ], 00:19:56.351 "driver_specific": {} 00:19:56.351 } 00:19:56.351 ] 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.351 13:28:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:56.610 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:56.610 "name": "Existed_Raid", 00:19:56.610 "uuid": "0690ba4a-d689-4c2e-8f20-55ce721507f4", 00:19:56.610 "strip_size_kb": 64, 00:19:56.610 "state": "online", 00:19:56.610 "raid_level": "concat", 00:19:56.610 "superblock": true, 00:19:56.610 "num_base_bdevs": 4, 00:19:56.610 "num_base_bdevs_discovered": 4, 00:19:56.610 "num_base_bdevs_operational": 4, 00:19:56.610 "base_bdevs_list": [ 00:19:56.610 { 00:19:56.610 "name": "NewBaseBdev", 00:19:56.610 "uuid": "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda", 00:19:56.610 "is_configured": true, 00:19:56.610 "data_offset": 2048, 00:19:56.610 "data_size": 63488 00:19:56.610 }, 00:19:56.610 { 00:19:56.610 "name": "BaseBdev2", 00:19:56.610 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:56.610 "is_configured": true, 00:19:56.610 "data_offset": 2048, 00:19:56.610 "data_size": 63488 00:19:56.610 }, 00:19:56.610 { 00:19:56.610 "name": "BaseBdev3", 00:19:56.610 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:56.610 "is_configured": true, 00:19:56.610 "data_offset": 2048, 00:19:56.610 "data_size": 63488 00:19:56.610 }, 00:19:56.610 { 00:19:56.610 "name": "BaseBdev4", 00:19:56.610 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:56.610 "is_configured": true, 00:19:56.610 "data_offset": 2048, 00:19:56.610 "data_size": 63488 00:19:56.610 } 00:19:56.610 ] 00:19:56.610 }' 00:19:56.610 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:56.610 13:28:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # 
verify_raid_bdev_properties Existed_Raid 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:57.179 [2024-07-25 13:28:37.873531] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:57.179 "name": "Existed_Raid", 00:19:57.179 "aliases": [ 00:19:57.179 "0690ba4a-d689-4c2e-8f20-55ce721507f4" 00:19:57.179 ], 00:19:57.179 "product_name": "Raid Volume", 00:19:57.179 "block_size": 512, 00:19:57.179 "num_blocks": 253952, 00:19:57.179 "uuid": "0690ba4a-d689-4c2e-8f20-55ce721507f4", 00:19:57.179 "assigned_rate_limits": { 00:19:57.179 "rw_ios_per_sec": 0, 00:19:57.179 "rw_mbytes_per_sec": 0, 00:19:57.179 "r_mbytes_per_sec": 0, 00:19:57.179 "w_mbytes_per_sec": 0 00:19:57.179 }, 00:19:57.179 "claimed": false, 00:19:57.179 "zoned": false, 00:19:57.179 "supported_io_types": { 00:19:57.179 "read": true, 00:19:57.179 "write": true, 00:19:57.179 "unmap": true, 00:19:57.179 "flush": true, 00:19:57.179 "reset": true, 00:19:57.179 "nvme_admin": false, 00:19:57.179 "nvme_io": false, 00:19:57.179 "nvme_io_md": false, 00:19:57.179 
"write_zeroes": true, 00:19:57.179 "zcopy": false, 00:19:57.179 "get_zone_info": false, 00:19:57.179 "zone_management": false, 00:19:57.179 "zone_append": false, 00:19:57.179 "compare": false, 00:19:57.179 "compare_and_write": false, 00:19:57.179 "abort": false, 00:19:57.179 "seek_hole": false, 00:19:57.179 "seek_data": false, 00:19:57.179 "copy": false, 00:19:57.179 "nvme_iov_md": false 00:19:57.179 }, 00:19:57.179 "memory_domains": [ 00:19:57.179 { 00:19:57.179 "dma_device_id": "system", 00:19:57.179 "dma_device_type": 1 00:19:57.179 }, 00:19:57.179 { 00:19:57.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:57.179 "dma_device_type": 2 00:19:57.179 }, 00:19:57.179 { 00:19:57.179 "dma_device_id": "system", 00:19:57.179 "dma_device_type": 1 00:19:57.179 }, 00:19:57.179 { 00:19:57.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:57.179 "dma_device_type": 2 00:19:57.179 }, 00:19:57.179 { 00:19:57.179 "dma_device_id": "system", 00:19:57.179 "dma_device_type": 1 00:19:57.179 }, 00:19:57.179 { 00:19:57.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:57.179 "dma_device_type": 2 00:19:57.179 }, 00:19:57.179 { 00:19:57.179 "dma_device_id": "system", 00:19:57.179 "dma_device_type": 1 00:19:57.179 }, 00:19:57.179 { 00:19:57.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:57.179 "dma_device_type": 2 00:19:57.179 } 00:19:57.179 ], 00:19:57.179 "driver_specific": { 00:19:57.179 "raid": { 00:19:57.179 "uuid": "0690ba4a-d689-4c2e-8f20-55ce721507f4", 00:19:57.179 "strip_size_kb": 64, 00:19:57.179 "state": "online", 00:19:57.179 "raid_level": "concat", 00:19:57.179 "superblock": true, 00:19:57.179 "num_base_bdevs": 4, 00:19:57.179 "num_base_bdevs_discovered": 4, 00:19:57.179 "num_base_bdevs_operational": 4, 00:19:57.179 "base_bdevs_list": [ 00:19:57.179 { 00:19:57.179 "name": "NewBaseBdev", 00:19:57.179 "uuid": "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda", 00:19:57.179 "is_configured": true, 00:19:57.179 "data_offset": 2048, 00:19:57.179 "data_size": 63488 00:19:57.179 }, 
00:19:57.179 { 00:19:57.179 "name": "BaseBdev2", 00:19:57.179 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:57.179 "is_configured": true, 00:19:57.179 "data_offset": 2048, 00:19:57.179 "data_size": 63488 00:19:57.179 }, 00:19:57.179 { 00:19:57.179 "name": "BaseBdev3", 00:19:57.179 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:57.179 "is_configured": true, 00:19:57.179 "data_offset": 2048, 00:19:57.179 "data_size": 63488 00:19:57.179 }, 00:19:57.179 { 00:19:57.179 "name": "BaseBdev4", 00:19:57.179 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:57.179 "is_configured": true, 00:19:57.179 "data_offset": 2048, 00:19:57.179 "data_size": 63488 00:19:57.179 } 00:19:57.179 ] 00:19:57.179 } 00:19:57.179 } 00:19:57.179 }' 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:57.179 BaseBdev2 00:19:57.179 BaseBdev3 00:19:57.179 BaseBdev4' 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:57.179 13:28:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:57.439 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:57.439 "name": "NewBaseBdev", 00:19:57.439 "aliases": [ 00:19:57.439 "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda" 00:19:57.439 ], 00:19:57.439 "product_name": "Malloc disk", 00:19:57.439 "block_size": 512, 00:19:57.439 "num_blocks": 65536, 00:19:57.439 "uuid": "f0cf197f-77a2-4d7f-9ae1-c3215efc2fda", 00:19:57.439 "assigned_rate_limits": { 00:19:57.439 
"rw_ios_per_sec": 0, 00:19:57.439 "rw_mbytes_per_sec": 0, 00:19:57.439 "r_mbytes_per_sec": 0, 00:19:57.439 "w_mbytes_per_sec": 0 00:19:57.439 }, 00:19:57.439 "claimed": true, 00:19:57.439 "claim_type": "exclusive_write", 00:19:57.439 "zoned": false, 00:19:57.439 "supported_io_types": { 00:19:57.439 "read": true, 00:19:57.439 "write": true, 00:19:57.439 "unmap": true, 00:19:57.439 "flush": true, 00:19:57.439 "reset": true, 00:19:57.439 "nvme_admin": false, 00:19:57.439 "nvme_io": false, 00:19:57.439 "nvme_io_md": false, 00:19:57.439 "write_zeroes": true, 00:19:57.439 "zcopy": true, 00:19:57.439 "get_zone_info": false, 00:19:57.439 "zone_management": false, 00:19:57.439 "zone_append": false, 00:19:57.439 "compare": false, 00:19:57.439 "compare_and_write": false, 00:19:57.439 "abort": true, 00:19:57.439 "seek_hole": false, 00:19:57.439 "seek_data": false, 00:19:57.439 "copy": true, 00:19:57.439 "nvme_iov_md": false 00:19:57.439 }, 00:19:57.439 "memory_domains": [ 00:19:57.439 { 00:19:57.439 "dma_device_id": "system", 00:19:57.439 "dma_device_type": 1 00:19:57.439 }, 00:19:57.439 { 00:19:57.439 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:57.439 "dma_device_type": 2 00:19:57.439 } 00:19:57.439 ], 00:19:57.439 "driver_specific": {} 00:19:57.439 }' 00:19:57.439 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.439 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.439 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:57.439 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:57.699 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:57.958 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:57.958 "name": "BaseBdev2", 00:19:57.958 "aliases": [ 00:19:57.958 "2862607f-e67d-48cd-96c2-61a913c21ea7" 00:19:57.958 ], 00:19:57.958 "product_name": "Malloc disk", 00:19:57.958 "block_size": 512, 00:19:57.958 "num_blocks": 65536, 00:19:57.958 "uuid": "2862607f-e67d-48cd-96c2-61a913c21ea7", 00:19:57.958 "assigned_rate_limits": { 00:19:57.958 "rw_ios_per_sec": 0, 00:19:57.958 "rw_mbytes_per_sec": 0, 00:19:57.958 "r_mbytes_per_sec": 0, 00:19:57.958 "w_mbytes_per_sec": 0 00:19:57.958 }, 00:19:57.958 "claimed": true, 00:19:57.958 "claim_type": "exclusive_write", 00:19:57.958 "zoned": false, 00:19:57.958 "supported_io_types": { 00:19:57.958 "read": true, 00:19:57.958 "write": true, 00:19:57.958 "unmap": true, 00:19:57.958 "flush": true, 00:19:57.958 "reset": true, 00:19:57.958 "nvme_admin": false, 00:19:57.958 "nvme_io": false, 00:19:57.958 "nvme_io_md": false, 00:19:57.958 "write_zeroes": true, 
00:19:57.958 "zcopy": true, 00:19:57.959 "get_zone_info": false, 00:19:57.959 "zone_management": false, 00:19:57.959 "zone_append": false, 00:19:57.959 "compare": false, 00:19:57.959 "compare_and_write": false, 00:19:57.959 "abort": true, 00:19:57.959 "seek_hole": false, 00:19:57.959 "seek_data": false, 00:19:57.959 "copy": true, 00:19:57.959 "nvme_iov_md": false 00:19:57.959 }, 00:19:57.959 "memory_domains": [ 00:19:57.959 { 00:19:57.959 "dma_device_id": "system", 00:19:57.959 "dma_device_type": 1 00:19:57.959 }, 00:19:57.959 { 00:19:57.959 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:57.959 "dma_device_type": 2 00:19:57.959 } 00:19:57.959 ], 00:19:57.959 "driver_specific": {} 00:19:57.959 }' 00:19:57.959 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:57.959 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:58.218 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:58.218 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:58.218 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:58.218 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:58.218 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:58.218 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:58.218 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:58.218 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:58.218 13:28:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:58.477 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:58.477 13:28:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:58.477 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:58.477 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:58.477 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:58.477 "name": "BaseBdev3", 00:19:58.477 "aliases": [ 00:19:58.477 "5a887f2a-0141-4dfe-a42f-3ebfb43826c1" 00:19:58.477 ], 00:19:58.477 "product_name": "Malloc disk", 00:19:58.477 "block_size": 512, 00:19:58.477 "num_blocks": 65536, 00:19:58.477 "uuid": "5a887f2a-0141-4dfe-a42f-3ebfb43826c1", 00:19:58.477 "assigned_rate_limits": { 00:19:58.477 "rw_ios_per_sec": 0, 00:19:58.477 "rw_mbytes_per_sec": 0, 00:19:58.477 "r_mbytes_per_sec": 0, 00:19:58.477 "w_mbytes_per_sec": 0 00:19:58.477 }, 00:19:58.477 "claimed": true, 00:19:58.477 "claim_type": "exclusive_write", 00:19:58.477 "zoned": false, 00:19:58.477 "supported_io_types": { 00:19:58.477 "read": true, 00:19:58.477 "write": true, 00:19:58.477 "unmap": true, 00:19:58.477 "flush": true, 00:19:58.477 "reset": true, 00:19:58.477 "nvme_admin": false, 00:19:58.477 "nvme_io": false, 00:19:58.477 "nvme_io_md": false, 00:19:58.477 "write_zeroes": true, 00:19:58.477 "zcopy": true, 00:19:58.477 "get_zone_info": false, 00:19:58.477 "zone_management": false, 00:19:58.477 "zone_append": false, 00:19:58.477 "compare": false, 00:19:58.477 "compare_and_write": false, 00:19:58.477 "abort": true, 00:19:58.477 "seek_hole": false, 00:19:58.477 "seek_data": false, 00:19:58.477 "copy": true, 00:19:58.477 "nvme_iov_md": false 00:19:58.477 }, 00:19:58.477 "memory_domains": [ 00:19:58.477 { 00:19:58.477 "dma_device_id": "system", 00:19:58.477 "dma_device_type": 1 00:19:58.477 }, 00:19:58.477 { 00:19:58.477 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:19:58.477 "dma_device_type": 2 00:19:58.477 } 00:19:58.477 ], 00:19:58.477 "driver_specific": {} 00:19:58.477 }' 00:19:58.477 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:58.736 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:58.736 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:58.736 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:58.736 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:58.736 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:58.736 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:58.736 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:58.736 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:58.736 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:58.736 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:58.996 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:58.996 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:58.996 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:58.996 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:58.996 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:58.996 "name": "BaseBdev4", 00:19:58.996 
"aliases": [ 00:19:58.996 "7f32ef83-9af8-4294-8238-aa2008de7302" 00:19:58.996 ], 00:19:58.996 "product_name": "Malloc disk", 00:19:58.996 "block_size": 512, 00:19:58.996 "num_blocks": 65536, 00:19:58.996 "uuid": "7f32ef83-9af8-4294-8238-aa2008de7302", 00:19:58.996 "assigned_rate_limits": { 00:19:58.996 "rw_ios_per_sec": 0, 00:19:58.996 "rw_mbytes_per_sec": 0, 00:19:58.996 "r_mbytes_per_sec": 0, 00:19:58.996 "w_mbytes_per_sec": 0 00:19:58.996 }, 00:19:58.996 "claimed": true, 00:19:58.996 "claim_type": "exclusive_write", 00:19:58.996 "zoned": false, 00:19:58.996 "supported_io_types": { 00:19:58.996 "read": true, 00:19:58.996 "write": true, 00:19:58.996 "unmap": true, 00:19:58.996 "flush": true, 00:19:58.996 "reset": true, 00:19:58.996 "nvme_admin": false, 00:19:58.996 "nvme_io": false, 00:19:58.996 "nvme_io_md": false, 00:19:58.996 "write_zeroes": true, 00:19:58.996 "zcopy": true, 00:19:58.996 "get_zone_info": false, 00:19:58.996 "zone_management": false, 00:19:58.996 "zone_append": false, 00:19:58.996 "compare": false, 00:19:58.996 "compare_and_write": false, 00:19:58.996 "abort": true, 00:19:58.996 "seek_hole": false, 00:19:58.996 "seek_data": false, 00:19:58.996 "copy": true, 00:19:58.996 "nvme_iov_md": false 00:19:58.996 }, 00:19:58.996 "memory_domains": [ 00:19:58.996 { 00:19:58.996 "dma_device_id": "system", 00:19:58.996 "dma_device_type": 1 00:19:58.996 }, 00:19:58.996 { 00:19:58.996 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:58.996 "dma_device_type": 2 00:19:58.996 } 00:19:58.996 ], 00:19:58.996 "driver_specific": {} 00:19:58.996 }' 00:19:58.996 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:59.255 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:59.255 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:59.255 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 
00:19:59.255 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:59.255 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:59.255 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:59.255 13:28:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:59.255 13:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:59.255 13:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:59.255 13:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:59.514 13:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:59.514 13:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:59.514 [2024-07-25 13:28:40.263650] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:59.514 [2024-07-25 13:28:40.263670] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:59.514 [2024-07-25 13:28:40.263710] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:59.514 [2024-07-25 13:28:40.263754] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:59.514 [2024-07-25 13:28:40.263761] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2482c70 name Existed_Raid, state offline 00:19:59.514 13:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 965862 00:19:59.514 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 965862 ']' 00:19:59.515 13:28:40 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # kill -0 965862 00:19:59.515 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:19:59.515 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:19:59.515 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 965862 00:19:59.774 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:19:59.774 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:19:59.774 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 965862' 00:19:59.774 killing process with pid 965862 00:19:59.774 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 965862 00:19:59.774 [2024-07-25 13:28:40.330749] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:59.774 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 965862 00:19:59.774 [2024-07-25 13:28:40.351371] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:59.774 13:28:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:59.774 00:19:59.774 real 0m27.546s 00:19:59.774 user 0m51.629s 00:19:59.774 sys 0m4.033s 00:19:59.774 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:19:59.774 13:28:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:59.774 ************************************ 00:19:59.774 END TEST raid_state_function_test_sb 00:19:59.774 ************************************ 00:19:59.774 13:28:40 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:19:59.774 
13:28:40 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:19:59.774 13:28:40 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:19:59.774 13:28:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:59.774 ************************************ 00:19:59.774 START TEST raid_superblock_test 00:19:59.774 ************************************ 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test concat 4 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=concat 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' concat '!=' raid1 ']' 00:19:59.774 
13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # strip_size=64 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # strip_size_create_arg='-z 64' 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=971127 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 971127 /var/tmp/spdk-raid.sock 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 971127 ']' 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:59.774 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:19:59.774 13:28:40 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.033 [2024-07-25 13:28:40.608118] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:20:00.033 [2024-07-25 13:28:40.608171] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid971127 ] 00:20:00.033 [2024-07-25 13:28:40.697892] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:00.033 [2024-07-25 13:28:40.764519] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:00.033 [2024-07-25 13:28:40.814667] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:00.033 [2024-07-25 13:28:40.814691] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:00.972 13:28:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:00.972 13:28:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:20:00.973 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:20:00.973 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:20:00.973 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:20:00.973 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:20:00.973 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:00.973 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:00.973 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:20:00.973 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:00.973 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:20:00.973 malloc1 00:20:00.973 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:01.233 [2024-07-25 13:28:41.817657] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:01.233 [2024-07-25 13:28:41.817691] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:01.233 [2024-07-25 13:28:41.817702] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11189b0 00:20:01.233 [2024-07-25 13:28:41.817708] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:01.233 [2024-07-25 13:28:41.819009] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:01.233 [2024-07-25 13:28:41.819029] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:01.233 pt1 00:20:01.233 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:20:01.233 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:20:01.234 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:20:01.234 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:20:01.234 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:01.234 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:01.234 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:20:01.234 13:28:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:01.234 13:28:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:01.234 malloc2 00:20:01.234 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:01.493 [2024-07-25 13:28:42.188511] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:01.493 [2024-07-25 13:28:42.188541] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:01.493 [2024-07-25 13:28:42.188553] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1119db0 00:20:01.493 [2024-07-25 13:28:42.188560] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:01.493 [2024-07-25 13:28:42.189770] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:01.493 [2024-07-25 13:28:42.189789] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:01.493 pt2 00:20:01.493 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:20:01.493 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:20:01.493 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:20:01.493 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:20:01.493 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:01.493 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:01.494 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:20:01.494 13:28:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:01.494 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:01.753 malloc3 00:20:01.753 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:02.013 [2024-07-25 13:28:42.571423] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:02.013 [2024-07-25 13:28:42.571450] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.013 [2024-07-25 13:28:42.571459] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b0780 00:20:02.013 [2024-07-25 13:28:42.571465] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.013 [2024-07-25 13:28:42.572646] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.013 [2024-07-25 13:28:42.572665] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:02.013 pt3 00:20:02.013 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:20:02.013 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:20:02.013 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:20:02.013 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:20:02.013 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:02.013 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:02.013 
13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:20:02.013 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:02.013 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:02.013 malloc4 00:20:02.013 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:02.273 [2024-07-25 13:28:42.958362] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:02.273 [2024-07-25 13:28:42.958390] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:02.273 [2024-07-25 13:28:42.958400] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12b30a0 00:20:02.273 [2024-07-25 13:28:42.958406] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:02.273 [2024-07-25 13:28:42.959595] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:02.273 [2024-07-25 13:28:42.959613] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:02.273 pt4 00:20:02.273 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:20:02.273 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:20:02.273 13:28:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:02.533 [2024-07-25 13:28:43.150877] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:20:02.533 [2024-07-25 13:28:43.151885] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:02.533 [2024-07-25 13:28:43.151932] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:02.533 [2024-07-25 13:28:43.151966] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:02.533 [2024-07-25 13:28:43.152087] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1110970 00:20:02.533 [2024-07-25 13:28:43.152094] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:02.533 [2024-07-25 13:28:43.152251] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x110ebb0 00:20:02.533 [2024-07-25 13:28:43.152359] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1110970 00:20:02.533 [2024-07-25 13:28:43.152365] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1110970 00:20:02.533 [2024-07-25 13:28:43.152445] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.533 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:02.792 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.792 "name": "raid_bdev1", 00:20:02.792 "uuid": "064579f7-756b-4309-8d08-f1bca32ccbfe", 00:20:02.792 "strip_size_kb": 64, 00:20:02.792 "state": "online", 00:20:02.792 "raid_level": "concat", 00:20:02.792 "superblock": true, 00:20:02.792 "num_base_bdevs": 4, 00:20:02.792 "num_base_bdevs_discovered": 4, 00:20:02.792 "num_base_bdevs_operational": 4, 00:20:02.792 "base_bdevs_list": [ 00:20:02.792 { 00:20:02.792 "name": "pt1", 00:20:02.792 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:02.792 "is_configured": true, 00:20:02.792 "data_offset": 2048, 00:20:02.792 "data_size": 63488 00:20:02.792 }, 00:20:02.792 { 00:20:02.792 "name": "pt2", 00:20:02.792 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:02.792 "is_configured": true, 00:20:02.792 "data_offset": 2048, 00:20:02.792 "data_size": 63488 00:20:02.792 }, 00:20:02.792 { 00:20:02.792 "name": "pt3", 00:20:02.792 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:02.792 "is_configured": true, 00:20:02.792 "data_offset": 2048, 00:20:02.792 "data_size": 63488 00:20:02.792 }, 00:20:02.792 { 00:20:02.792 "name": "pt4", 00:20:02.792 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:02.792 "is_configured": true, 00:20:02.792 "data_offset": 2048, 00:20:02.792 "data_size": 63488 00:20:02.792 } 00:20:02.792 ] 00:20:02.792 }' 00:20:02.792 13:28:43 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.792 13:28:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.392 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:20:03.392 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:03.392 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:03.392 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:03.392 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:03.392 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:03.392 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:03.392 13:28:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:03.392 [2024-07-25 13:28:44.077444] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:03.392 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:03.392 "name": "raid_bdev1", 00:20:03.392 "aliases": [ 00:20:03.392 "064579f7-756b-4309-8d08-f1bca32ccbfe" 00:20:03.392 ], 00:20:03.392 "product_name": "Raid Volume", 00:20:03.392 "block_size": 512, 00:20:03.392 "num_blocks": 253952, 00:20:03.392 "uuid": "064579f7-756b-4309-8d08-f1bca32ccbfe", 00:20:03.392 "assigned_rate_limits": { 00:20:03.392 "rw_ios_per_sec": 0, 00:20:03.392 "rw_mbytes_per_sec": 0, 00:20:03.392 "r_mbytes_per_sec": 0, 00:20:03.392 "w_mbytes_per_sec": 0 00:20:03.392 }, 00:20:03.392 "claimed": false, 00:20:03.392 "zoned": false, 00:20:03.392 "supported_io_types": { 00:20:03.392 "read": true, 00:20:03.392 "write": true, 00:20:03.392 
"unmap": true, 00:20:03.392 "flush": true, 00:20:03.392 "reset": true, 00:20:03.392 "nvme_admin": false, 00:20:03.392 "nvme_io": false, 00:20:03.393 "nvme_io_md": false, 00:20:03.393 "write_zeroes": true, 00:20:03.393 "zcopy": false, 00:20:03.393 "get_zone_info": false, 00:20:03.393 "zone_management": false, 00:20:03.393 "zone_append": false, 00:20:03.393 "compare": false, 00:20:03.393 "compare_and_write": false, 00:20:03.393 "abort": false, 00:20:03.393 "seek_hole": false, 00:20:03.393 "seek_data": false, 00:20:03.393 "copy": false, 00:20:03.393 "nvme_iov_md": false 00:20:03.393 }, 00:20:03.393 "memory_domains": [ 00:20:03.393 { 00:20:03.393 "dma_device_id": "system", 00:20:03.393 "dma_device_type": 1 00:20:03.393 }, 00:20:03.393 { 00:20:03.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.393 "dma_device_type": 2 00:20:03.393 }, 00:20:03.393 { 00:20:03.393 "dma_device_id": "system", 00:20:03.393 "dma_device_type": 1 00:20:03.393 }, 00:20:03.393 { 00:20:03.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.393 "dma_device_type": 2 00:20:03.393 }, 00:20:03.393 { 00:20:03.393 "dma_device_id": "system", 00:20:03.393 "dma_device_type": 1 00:20:03.393 }, 00:20:03.393 { 00:20:03.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.393 "dma_device_type": 2 00:20:03.393 }, 00:20:03.393 { 00:20:03.393 "dma_device_id": "system", 00:20:03.393 "dma_device_type": 1 00:20:03.393 }, 00:20:03.393 { 00:20:03.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.393 "dma_device_type": 2 00:20:03.393 } 00:20:03.393 ], 00:20:03.393 "driver_specific": { 00:20:03.393 "raid": { 00:20:03.393 "uuid": "064579f7-756b-4309-8d08-f1bca32ccbfe", 00:20:03.393 "strip_size_kb": 64, 00:20:03.393 "state": "online", 00:20:03.393 "raid_level": "concat", 00:20:03.393 "superblock": true, 00:20:03.393 "num_base_bdevs": 4, 00:20:03.393 "num_base_bdevs_discovered": 4, 00:20:03.393 "num_base_bdevs_operational": 4, 00:20:03.393 "base_bdevs_list": [ 00:20:03.393 { 00:20:03.393 "name": "pt1", 
00:20:03.393 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:03.393 "is_configured": true, 00:20:03.393 "data_offset": 2048, 00:20:03.393 "data_size": 63488 00:20:03.393 }, 00:20:03.393 { 00:20:03.393 "name": "pt2", 00:20:03.393 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:03.393 "is_configured": true, 00:20:03.393 "data_offset": 2048, 00:20:03.393 "data_size": 63488 00:20:03.393 }, 00:20:03.393 { 00:20:03.393 "name": "pt3", 00:20:03.393 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:03.393 "is_configured": true, 00:20:03.393 "data_offset": 2048, 00:20:03.393 "data_size": 63488 00:20:03.393 }, 00:20:03.393 { 00:20:03.393 "name": "pt4", 00:20:03.393 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:03.393 "is_configured": true, 00:20:03.393 "data_offset": 2048, 00:20:03.393 "data_size": 63488 00:20:03.393 } 00:20:03.393 ] 00:20:03.393 } 00:20:03.393 } 00:20:03.393 }' 00:20:03.393 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:03.393 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:03.393 pt2 00:20:03.393 pt3 00:20:03.393 pt4' 00:20:03.393 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:03.393 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:03.393 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:03.652 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:03.652 "name": "pt1", 00:20:03.652 "aliases": [ 00:20:03.652 "00000000-0000-0000-0000-000000000001" 00:20:03.652 ], 00:20:03.652 "product_name": "passthru", 00:20:03.652 "block_size": 512, 00:20:03.652 "num_blocks": 65536, 00:20:03.652 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:20:03.652 "assigned_rate_limits": { 00:20:03.652 "rw_ios_per_sec": 0, 00:20:03.652 "rw_mbytes_per_sec": 0, 00:20:03.652 "r_mbytes_per_sec": 0, 00:20:03.652 "w_mbytes_per_sec": 0 00:20:03.652 }, 00:20:03.652 "claimed": true, 00:20:03.652 "claim_type": "exclusive_write", 00:20:03.652 "zoned": false, 00:20:03.652 "supported_io_types": { 00:20:03.652 "read": true, 00:20:03.652 "write": true, 00:20:03.652 "unmap": true, 00:20:03.652 "flush": true, 00:20:03.652 "reset": true, 00:20:03.652 "nvme_admin": false, 00:20:03.652 "nvme_io": false, 00:20:03.652 "nvme_io_md": false, 00:20:03.652 "write_zeroes": true, 00:20:03.652 "zcopy": true, 00:20:03.652 "get_zone_info": false, 00:20:03.652 "zone_management": false, 00:20:03.652 "zone_append": false, 00:20:03.652 "compare": false, 00:20:03.652 "compare_and_write": false, 00:20:03.652 "abort": true, 00:20:03.653 "seek_hole": false, 00:20:03.653 "seek_data": false, 00:20:03.653 "copy": true, 00:20:03.653 "nvme_iov_md": false 00:20:03.653 }, 00:20:03.653 "memory_domains": [ 00:20:03.653 { 00:20:03.653 "dma_device_id": "system", 00:20:03.653 "dma_device_type": 1 00:20:03.653 }, 00:20:03.653 { 00:20:03.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:03.653 "dma_device_type": 2 00:20:03.653 } 00:20:03.653 ], 00:20:03.653 "driver_specific": { 00:20:03.653 "passthru": { 00:20:03.653 "name": "pt1", 00:20:03.653 "base_bdev_name": "malloc1" 00:20:03.653 } 00:20:03.653 } 00:20:03.653 }' 00:20:03.653 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:03.653 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:03.653 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:03.653 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:03.912 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:03.912 13:28:44 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:03.912 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:03.912 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:03.912 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:03.912 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:03.912 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:03.912 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:03.912 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:03.912 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:03.912 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:04.172 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:04.172 "name": "pt2", 00:20:04.172 "aliases": [ 00:20:04.172 "00000000-0000-0000-0000-000000000002" 00:20:04.172 ], 00:20:04.172 "product_name": "passthru", 00:20:04.172 "block_size": 512, 00:20:04.172 "num_blocks": 65536, 00:20:04.172 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:04.172 "assigned_rate_limits": { 00:20:04.172 "rw_ios_per_sec": 0, 00:20:04.172 "rw_mbytes_per_sec": 0, 00:20:04.172 "r_mbytes_per_sec": 0, 00:20:04.172 "w_mbytes_per_sec": 0 00:20:04.172 }, 00:20:04.172 "claimed": true, 00:20:04.172 "claim_type": "exclusive_write", 00:20:04.172 "zoned": false, 00:20:04.172 "supported_io_types": { 00:20:04.172 "read": true, 00:20:04.172 "write": true, 00:20:04.172 "unmap": true, 00:20:04.172 "flush": true, 00:20:04.172 "reset": true, 00:20:04.172 "nvme_admin": false, 00:20:04.172 
"nvme_io": false, 00:20:04.172 "nvme_io_md": false, 00:20:04.172 "write_zeroes": true, 00:20:04.172 "zcopy": true, 00:20:04.172 "get_zone_info": false, 00:20:04.172 "zone_management": false, 00:20:04.172 "zone_append": false, 00:20:04.172 "compare": false, 00:20:04.172 "compare_and_write": false, 00:20:04.172 "abort": true, 00:20:04.172 "seek_hole": false, 00:20:04.172 "seek_data": false, 00:20:04.172 "copy": true, 00:20:04.172 "nvme_iov_md": false 00:20:04.172 }, 00:20:04.172 "memory_domains": [ 00:20:04.172 { 00:20:04.172 "dma_device_id": "system", 00:20:04.172 "dma_device_type": 1 00:20:04.172 }, 00:20:04.172 { 00:20:04.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.172 "dma_device_type": 2 00:20:04.172 } 00:20:04.172 ], 00:20:04.172 "driver_specific": { 00:20:04.172 "passthru": { 00:20:04.172 "name": "pt2", 00:20:04.172 "base_bdev_name": "malloc2" 00:20:04.172 } 00:20:04.172 } 00:20:04.172 }' 00:20:04.172 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.172 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.172 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:04.172 13:28:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.432 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.432 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:04.432 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.432 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.432 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:04.432 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:04.432 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:20:04.691 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:04.691 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:04.691 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:04.691 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:04.691 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:04.691 "name": "pt3", 00:20:04.691 "aliases": [ 00:20:04.691 "00000000-0000-0000-0000-000000000003" 00:20:04.691 ], 00:20:04.691 "product_name": "passthru", 00:20:04.691 "block_size": 512, 00:20:04.692 "num_blocks": 65536, 00:20:04.692 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:04.692 "assigned_rate_limits": { 00:20:04.692 "rw_ios_per_sec": 0, 00:20:04.692 "rw_mbytes_per_sec": 0, 00:20:04.692 "r_mbytes_per_sec": 0, 00:20:04.692 "w_mbytes_per_sec": 0 00:20:04.692 }, 00:20:04.692 "claimed": true, 00:20:04.692 "claim_type": "exclusive_write", 00:20:04.692 "zoned": false, 00:20:04.692 "supported_io_types": { 00:20:04.692 "read": true, 00:20:04.692 "write": true, 00:20:04.692 "unmap": true, 00:20:04.692 "flush": true, 00:20:04.692 "reset": true, 00:20:04.692 "nvme_admin": false, 00:20:04.692 "nvme_io": false, 00:20:04.692 "nvme_io_md": false, 00:20:04.692 "write_zeroes": true, 00:20:04.692 "zcopy": true, 00:20:04.692 "get_zone_info": false, 00:20:04.692 "zone_management": false, 00:20:04.692 "zone_append": false, 00:20:04.692 "compare": false, 00:20:04.692 "compare_and_write": false, 00:20:04.692 "abort": true, 00:20:04.692 "seek_hole": false, 00:20:04.692 "seek_data": false, 00:20:04.692 "copy": true, 00:20:04.692 "nvme_iov_md": false 00:20:04.692 }, 00:20:04.692 "memory_domains": [ 00:20:04.692 { 00:20:04.692 "dma_device_id": "system", 00:20:04.692 
"dma_device_type": 1 00:20:04.692 }, 00:20:04.692 { 00:20:04.692 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:04.692 "dma_device_type": 2 00:20:04.692 } 00:20:04.692 ], 00:20:04.692 "driver_specific": { 00:20:04.692 "passthru": { 00:20:04.692 "name": "pt3", 00:20:04.692 "base_bdev_name": "malloc3" 00:20:04.692 } 00:20:04.692 } 00:20:04.692 }' 00:20:04.692 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.692 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:04.951 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:04.951 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.951 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:04.951 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:04.951 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.951 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:04.951 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:04.951 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:04.951 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.210 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:05.210 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:05.210 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:05.210 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:05.210 13:28:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:05.210 "name": "pt4", 00:20:05.210 "aliases": [ 00:20:05.210 "00000000-0000-0000-0000-000000000004" 00:20:05.210 ], 00:20:05.210 "product_name": "passthru", 00:20:05.210 "block_size": 512, 00:20:05.210 "num_blocks": 65536, 00:20:05.210 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:05.210 "assigned_rate_limits": { 00:20:05.210 "rw_ios_per_sec": 0, 00:20:05.210 "rw_mbytes_per_sec": 0, 00:20:05.210 "r_mbytes_per_sec": 0, 00:20:05.210 "w_mbytes_per_sec": 0 00:20:05.210 }, 00:20:05.210 "claimed": true, 00:20:05.210 "claim_type": "exclusive_write", 00:20:05.210 "zoned": false, 00:20:05.210 "supported_io_types": { 00:20:05.210 "read": true, 00:20:05.210 "write": true, 00:20:05.210 "unmap": true, 00:20:05.210 "flush": true, 00:20:05.210 "reset": true, 00:20:05.210 "nvme_admin": false, 00:20:05.210 "nvme_io": false, 00:20:05.210 "nvme_io_md": false, 00:20:05.210 "write_zeroes": true, 00:20:05.210 "zcopy": true, 00:20:05.210 "get_zone_info": false, 00:20:05.210 "zone_management": false, 00:20:05.210 "zone_append": false, 00:20:05.210 "compare": false, 00:20:05.210 "compare_and_write": false, 00:20:05.210 "abort": true, 00:20:05.210 "seek_hole": false, 00:20:05.210 "seek_data": false, 00:20:05.210 "copy": true, 00:20:05.210 "nvme_iov_md": false 00:20:05.210 }, 00:20:05.210 "memory_domains": [ 00:20:05.210 { 00:20:05.211 "dma_device_id": "system", 00:20:05.211 "dma_device_type": 1 00:20:05.211 }, 00:20:05.211 { 00:20:05.211 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:05.211 "dma_device_type": 2 00:20:05.211 } 00:20:05.211 ], 00:20:05.211 "driver_specific": { 00:20:05.211 "passthru": { 00:20:05.211 "name": "pt4", 00:20:05.211 "base_bdev_name": "malloc4" 00:20:05.211 } 00:20:05.211 } 00:20:05.211 }' 00:20:05.211 13:28:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:05.470 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:05.470 13:28:46 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:05.470 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.470 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:05.470 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:05.470 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.470 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:05.470 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:05.470 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.729 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:05.729 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:05.729 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:05.729 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:20:05.729 [2024-07-25 13:28:46.479513] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:05.729 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=064579f7-756b-4309-8d08-f1bca32ccbfe 00:20:05.729 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z 064579f7-756b-4309-8d08-f1bca32ccbfe ']' 00:20:05.729 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:05.990 [2024-07-25 13:28:46.671754] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:05.990 
[2024-07-25 13:28:46.671770] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:05.990 [2024-07-25 13:28:46.671805] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:05.990 [2024-07-25 13:28:46.671853] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:05.990 [2024-07-25 13:28:46.671859] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1110970 name raid_bdev1, state offline 00:20:05.990 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.990 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:20:06.249 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:20:06.249 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:20:06.249 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:20:06.249 13:28:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:06.509 13:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:20:06.509 13:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:06.509 13:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:20:06.509 13:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:06.768 13:28:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:20:06.768 13:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:07.027 13:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:07.027 13:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:07.286 13:28:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:07.286 [2024-07-25 13:28:48.015110] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:07.286 [2024-07-25 13:28:48.016189] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:07.286 [2024-07-25 13:28:48.016223] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:20:07.286 [2024-07-25 13:28:48.016249] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:07.287 [2024-07-25 13:28:48.016282] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:07.287 [2024-07-25 13:28:48.016308] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:07.287 [2024-07-25 13:28:48.016322] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:07.287 [2024-07-25 13:28:48.016335] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:07.287 
[2024-07-25 13:28:48.016350] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:07.287 [2024-07-25 13:28:48.016356] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1118e50 name raid_bdev1, state configuring 00:20:07.287 request: 00:20:07.287 { 00:20:07.287 "name": "raid_bdev1", 00:20:07.287 "raid_level": "concat", 00:20:07.287 "base_bdevs": [ 00:20:07.287 "malloc1", 00:20:07.287 "malloc2", 00:20:07.287 "malloc3", 00:20:07.287 "malloc4" 00:20:07.287 ], 00:20:07.287 "strip_size_kb": 64, 00:20:07.287 "superblock": false, 00:20:07.287 "method": "bdev_raid_create", 00:20:07.287 "req_id": 1 00:20:07.287 } 00:20:07.287 Got JSON-RPC error response 00:20:07.287 response: 00:20:07.287 { 00:20:07.287 "code": -17, 00:20:07.287 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:07.287 } 00:20:07.287 13:28:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:20:07.287 13:28:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:20:07.287 13:28:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:20:07.287 13:28:48 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:20:07.287 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.287 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:20:07.546 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:20:07.546 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:20:07.546 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 
00:20:07.806 [2024-07-25 13:28:48.400052] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:07.806 [2024-07-25 13:28:48.400077] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:07.806 [2024-07-25 13:28:48.400088] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1113890 00:20:07.806 [2024-07-25 13:28:48.400094] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:07.806 [2024-07-25 13:28:48.401343] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:07.806 [2024-07-25 13:28:48.401364] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:07.806 [2024-07-25 13:28:48.401410] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:07.806 [2024-07-25 13:28:48.401428] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:07.806 pt1 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.806 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:08.066 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:08.066 "name": "raid_bdev1", 00:20:08.066 "uuid": "064579f7-756b-4309-8d08-f1bca32ccbfe", 00:20:08.066 "strip_size_kb": 64, 00:20:08.066 "state": "configuring", 00:20:08.066 "raid_level": "concat", 00:20:08.066 "superblock": true, 00:20:08.066 "num_base_bdevs": 4, 00:20:08.066 "num_base_bdevs_discovered": 1, 00:20:08.066 "num_base_bdevs_operational": 4, 00:20:08.066 "base_bdevs_list": [ 00:20:08.066 { 00:20:08.066 "name": "pt1", 00:20:08.066 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:08.066 "is_configured": true, 00:20:08.066 "data_offset": 2048, 00:20:08.066 "data_size": 63488 00:20:08.066 }, 00:20:08.066 { 00:20:08.066 "name": null, 00:20:08.066 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:08.066 "is_configured": false, 00:20:08.066 "data_offset": 2048, 00:20:08.066 "data_size": 63488 00:20:08.066 }, 00:20:08.066 { 00:20:08.066 "name": null, 00:20:08.066 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:08.066 "is_configured": false, 00:20:08.066 "data_offset": 2048, 00:20:08.066 "data_size": 63488 00:20:08.066 }, 00:20:08.066 { 00:20:08.066 "name": null, 00:20:08.066 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:08.066 "is_configured": false, 00:20:08.066 "data_offset": 2048, 00:20:08.066 "data_size": 63488 00:20:08.066 } 00:20:08.066 ] 00:20:08.066 }' 00:20:08.066 13:28:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:08.066 13:28:48 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:08.637 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:20:08.637 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:08.637 [2024-07-25 13:28:49.322398] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:08.637 [2024-07-25 13:28:49.322429] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:08.637 [2024-07-25 13:28:49.322440] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1118be0 00:20:08.637 [2024-07-25 13:28:49.322446] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:08.637 [2024-07-25 13:28:49.322721] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:08.637 [2024-07-25 13:28:49.322734] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:08.637 [2024-07-25 13:28:49.322777] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:08.637 [2024-07-25 13:28:49.322789] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:08.637 pt2 00:20:08.637 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:08.897 [2024-07-25 13:28:49.510882] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:08.897 13:28:49 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:08.897 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:09.156 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:09.156 "name": "raid_bdev1", 00:20:09.156 "uuid": "064579f7-756b-4309-8d08-f1bca32ccbfe", 00:20:09.156 "strip_size_kb": 64, 00:20:09.156 "state": "configuring", 00:20:09.156 "raid_level": "concat", 00:20:09.156 "superblock": true, 00:20:09.156 "num_base_bdevs": 4, 00:20:09.156 "num_base_bdevs_discovered": 1, 00:20:09.156 "num_base_bdevs_operational": 4, 00:20:09.156 "base_bdevs_list": [ 00:20:09.156 { 00:20:09.156 "name": "pt1", 00:20:09.156 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:09.156 "is_configured": true, 00:20:09.156 "data_offset": 2048, 00:20:09.156 "data_size": 63488 00:20:09.156 }, 00:20:09.156 { 00:20:09.156 "name": null, 00:20:09.156 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:09.157 
"is_configured": false, 00:20:09.157 "data_offset": 2048, 00:20:09.157 "data_size": 63488 00:20:09.157 }, 00:20:09.157 { 00:20:09.157 "name": null, 00:20:09.157 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:09.157 "is_configured": false, 00:20:09.157 "data_offset": 2048, 00:20:09.157 "data_size": 63488 00:20:09.157 }, 00:20:09.157 { 00:20:09.157 "name": null, 00:20:09.157 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:09.157 "is_configured": false, 00:20:09.157 "data_offset": 2048, 00:20:09.157 "data_size": 63488 00:20:09.157 } 00:20:09.157 ] 00:20:09.157 }' 00:20:09.157 13:28:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:09.157 13:28:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:09.726 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:20:09.726 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:20:09.726 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:09.726 [2024-07-25 13:28:50.449260] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:09.726 [2024-07-25 13:28:50.449295] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.726 [2024-07-25 13:28:50.449305] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110ff50 00:20:09.726 [2024-07-25 13:28:50.449311] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.726 [2024-07-25 13:28:50.449583] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.726 [2024-07-25 13:28:50.449594] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:09.726 [2024-07-25 13:28:50.449637] 
bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:09.726 [2024-07-25 13:28:50.449649] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:09.726 pt2 00:20:09.726 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:20:09.726 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:20:09.726 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:09.986 [2024-07-25 13:28:50.637744] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:09.986 [2024-07-25 13:28:50.637767] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:09.986 [2024-07-25 13:28:50.637777] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11108e0 00:20:09.986 [2024-07-25 13:28:50.637783] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:09.986 [2024-07-25 13:28:50.638008] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:09.986 [2024-07-25 13:28:50.638018] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:09.986 [2024-07-25 13:28:50.638051] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:09.986 [2024-07-25 13:28:50.638062] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:09.986 pt3 00:20:09.986 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:20:09.986 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:20:09.986 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:10.246 [2024-07-25 13:28:50.834243] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:10.246 [2024-07-25 13:28:50.834264] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:10.246 [2024-07-25 13:28:50.834277] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1113bb0 00:20:10.246 [2024-07-25 13:28:50.834282] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:10.246 [2024-07-25 13:28:50.834495] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:10.246 [2024-07-25 13:28:50.834505] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:10.246 [2024-07-25 13:28:50.834537] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:10.246 [2024-07-25 13:28:50.834553] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:10.246 [2024-07-25 13:28:50.834645] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1110480 00:20:10.246 [2024-07-25 13:28:50.834651] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:10.246 [2024-07-25 13:28:50.834783] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1112db0 00:20:10.246 [2024-07-25 13:28:50.834883] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1110480 00:20:10.246 [2024-07-25 13:28:50.834888] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1110480 00:20:10.246 [2024-07-25 13:28:50.834959] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:10.246 pt4 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 
00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:10.246 13:28:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:10.506 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:10.506 "name": "raid_bdev1", 00:20:10.506 "uuid": "064579f7-756b-4309-8d08-f1bca32ccbfe", 00:20:10.506 "strip_size_kb": 64, 00:20:10.506 "state": "online", 00:20:10.506 "raid_level": "concat", 00:20:10.506 "superblock": true, 00:20:10.506 "num_base_bdevs": 4, 00:20:10.506 "num_base_bdevs_discovered": 4, 00:20:10.506 "num_base_bdevs_operational": 4, 
00:20:10.506 "base_bdevs_list": [ 00:20:10.506 { 00:20:10.507 "name": "pt1", 00:20:10.507 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:10.507 "is_configured": true, 00:20:10.507 "data_offset": 2048, 00:20:10.507 "data_size": 63488 00:20:10.507 }, 00:20:10.507 { 00:20:10.507 "name": "pt2", 00:20:10.507 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:10.507 "is_configured": true, 00:20:10.507 "data_offset": 2048, 00:20:10.507 "data_size": 63488 00:20:10.507 }, 00:20:10.507 { 00:20:10.507 "name": "pt3", 00:20:10.507 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:10.507 "is_configured": true, 00:20:10.507 "data_offset": 2048, 00:20:10.507 "data_size": 63488 00:20:10.507 }, 00:20:10.507 { 00:20:10.507 "name": "pt4", 00:20:10.507 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:10.507 "is_configured": true, 00:20:10.507 "data_offset": 2048, 00:20:10.507 "data_size": 63488 00:20:10.507 } 00:20:10.507 ] 00:20:10.507 }' 00:20:10.507 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:10.507 13:28:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:11.077 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:20:11.077 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:11.077 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:11.077 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:11.077 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:11.077 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:11.077 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 
00:20:11.077 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:11.077 [2024-07-25 13:28:51.788907] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:11.077 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:11.077 "name": "raid_bdev1", 00:20:11.077 "aliases": [ 00:20:11.077 "064579f7-756b-4309-8d08-f1bca32ccbfe" 00:20:11.077 ], 00:20:11.077 "product_name": "Raid Volume", 00:20:11.077 "block_size": 512, 00:20:11.077 "num_blocks": 253952, 00:20:11.077 "uuid": "064579f7-756b-4309-8d08-f1bca32ccbfe", 00:20:11.077 "assigned_rate_limits": { 00:20:11.077 "rw_ios_per_sec": 0, 00:20:11.077 "rw_mbytes_per_sec": 0, 00:20:11.077 "r_mbytes_per_sec": 0, 00:20:11.077 "w_mbytes_per_sec": 0 00:20:11.077 }, 00:20:11.077 "claimed": false, 00:20:11.077 "zoned": false, 00:20:11.077 "supported_io_types": { 00:20:11.077 "read": true, 00:20:11.077 "write": true, 00:20:11.077 "unmap": true, 00:20:11.077 "flush": true, 00:20:11.077 "reset": true, 00:20:11.077 "nvme_admin": false, 00:20:11.077 "nvme_io": false, 00:20:11.077 "nvme_io_md": false, 00:20:11.077 "write_zeroes": true, 00:20:11.077 "zcopy": false, 00:20:11.077 "get_zone_info": false, 00:20:11.077 "zone_management": false, 00:20:11.077 "zone_append": false, 00:20:11.077 "compare": false, 00:20:11.077 "compare_and_write": false, 00:20:11.077 "abort": false, 00:20:11.077 "seek_hole": false, 00:20:11.077 "seek_data": false, 00:20:11.077 "copy": false, 00:20:11.077 "nvme_iov_md": false 00:20:11.077 }, 00:20:11.077 "memory_domains": [ 00:20:11.077 { 00:20:11.077 "dma_device_id": "system", 00:20:11.077 "dma_device_type": 1 00:20:11.077 }, 00:20:11.077 { 00:20:11.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.077 "dma_device_type": 2 00:20:11.077 }, 00:20:11.077 { 00:20:11.077 "dma_device_id": "system", 00:20:11.077 "dma_device_type": 1 00:20:11.077 }, 00:20:11.077 { 00:20:11.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:20:11.077 "dma_device_type": 2 00:20:11.077 }, 00:20:11.077 { 00:20:11.077 "dma_device_id": "system", 00:20:11.078 "dma_device_type": 1 00:20:11.078 }, 00:20:11.078 { 00:20:11.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.078 "dma_device_type": 2 00:20:11.078 }, 00:20:11.078 { 00:20:11.078 "dma_device_id": "system", 00:20:11.078 "dma_device_type": 1 00:20:11.078 }, 00:20:11.078 { 00:20:11.078 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.078 "dma_device_type": 2 00:20:11.078 } 00:20:11.078 ], 00:20:11.078 "driver_specific": { 00:20:11.078 "raid": { 00:20:11.078 "uuid": "064579f7-756b-4309-8d08-f1bca32ccbfe", 00:20:11.078 "strip_size_kb": 64, 00:20:11.078 "state": "online", 00:20:11.078 "raid_level": "concat", 00:20:11.078 "superblock": true, 00:20:11.078 "num_base_bdevs": 4, 00:20:11.078 "num_base_bdevs_discovered": 4, 00:20:11.078 "num_base_bdevs_operational": 4, 00:20:11.078 "base_bdevs_list": [ 00:20:11.078 { 00:20:11.078 "name": "pt1", 00:20:11.078 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:11.078 "is_configured": true, 00:20:11.078 "data_offset": 2048, 00:20:11.078 "data_size": 63488 00:20:11.078 }, 00:20:11.078 { 00:20:11.078 "name": "pt2", 00:20:11.078 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:11.078 "is_configured": true, 00:20:11.078 "data_offset": 2048, 00:20:11.078 "data_size": 63488 00:20:11.078 }, 00:20:11.078 { 00:20:11.078 "name": "pt3", 00:20:11.078 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:11.078 "is_configured": true, 00:20:11.078 "data_offset": 2048, 00:20:11.078 "data_size": 63488 00:20:11.078 }, 00:20:11.078 { 00:20:11.078 "name": "pt4", 00:20:11.078 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:11.078 "is_configured": true, 00:20:11.078 "data_offset": 2048, 00:20:11.078 "data_size": 63488 00:20:11.078 } 00:20:11.078 ] 00:20:11.078 } 00:20:11.078 } 00:20:11.078 }' 00:20:11.078 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:11.078 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:11.078 pt2 00:20:11.078 pt3 00:20:11.078 pt4' 00:20:11.078 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:11.078 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:11.078 13:28:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:11.338 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:11.338 "name": "pt1", 00:20:11.338 "aliases": [ 00:20:11.338 "00000000-0000-0000-0000-000000000001" 00:20:11.338 ], 00:20:11.338 "product_name": "passthru", 00:20:11.338 "block_size": 512, 00:20:11.338 "num_blocks": 65536, 00:20:11.338 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:11.338 "assigned_rate_limits": { 00:20:11.338 "rw_ios_per_sec": 0, 00:20:11.338 "rw_mbytes_per_sec": 0, 00:20:11.338 "r_mbytes_per_sec": 0, 00:20:11.338 "w_mbytes_per_sec": 0 00:20:11.338 }, 00:20:11.338 "claimed": true, 00:20:11.338 "claim_type": "exclusive_write", 00:20:11.338 "zoned": false, 00:20:11.338 "supported_io_types": { 00:20:11.338 "read": true, 00:20:11.338 "write": true, 00:20:11.338 "unmap": true, 00:20:11.338 "flush": true, 00:20:11.338 "reset": true, 00:20:11.338 "nvme_admin": false, 00:20:11.338 "nvme_io": false, 00:20:11.338 "nvme_io_md": false, 00:20:11.338 "write_zeroes": true, 00:20:11.338 "zcopy": true, 00:20:11.338 "get_zone_info": false, 00:20:11.338 "zone_management": false, 00:20:11.338 "zone_append": false, 00:20:11.338 "compare": false, 00:20:11.338 "compare_and_write": false, 00:20:11.338 "abort": true, 00:20:11.338 "seek_hole": false, 00:20:11.338 "seek_data": false, 00:20:11.338 "copy": true, 00:20:11.338 "nvme_iov_md": 
false 00:20:11.338 }, 00:20:11.338 "memory_domains": [ 00:20:11.338 { 00:20:11.338 "dma_device_id": "system", 00:20:11.338 "dma_device_type": 1 00:20:11.338 }, 00:20:11.338 { 00:20:11.338 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.338 "dma_device_type": 2 00:20:11.338 } 00:20:11.338 ], 00:20:11.338 "driver_specific": { 00:20:11.338 "passthru": { 00:20:11.338 "name": "pt1", 00:20:11.338 "base_bdev_name": "malloc1" 00:20:11.338 } 00:20:11.338 } 00:20:11.338 }' 00:20:11.338 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:11.338 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:11.598 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:11.598 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:11.598 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:11.598 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:11.598 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.598 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.598 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:11.598 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.598 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.859 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:11.859 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:11.859 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:11.859 13:28:52 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:11.859 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:11.859 "name": "pt2", 00:20:11.859 "aliases": [ 00:20:11.859 "00000000-0000-0000-0000-000000000002" 00:20:11.859 ], 00:20:11.859 "product_name": "passthru", 00:20:11.859 "block_size": 512, 00:20:11.859 "num_blocks": 65536, 00:20:11.859 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:11.859 "assigned_rate_limits": { 00:20:11.859 "rw_ios_per_sec": 0, 00:20:11.859 "rw_mbytes_per_sec": 0, 00:20:11.859 "r_mbytes_per_sec": 0, 00:20:11.859 "w_mbytes_per_sec": 0 00:20:11.859 }, 00:20:11.859 "claimed": true, 00:20:11.859 "claim_type": "exclusive_write", 00:20:11.859 "zoned": false, 00:20:11.859 "supported_io_types": { 00:20:11.859 "read": true, 00:20:11.859 "write": true, 00:20:11.859 "unmap": true, 00:20:11.859 "flush": true, 00:20:11.859 "reset": true, 00:20:11.859 "nvme_admin": false, 00:20:11.859 "nvme_io": false, 00:20:11.859 "nvme_io_md": false, 00:20:11.859 "write_zeroes": true, 00:20:11.859 "zcopy": true, 00:20:11.859 "get_zone_info": false, 00:20:11.859 "zone_management": false, 00:20:11.859 "zone_append": false, 00:20:11.859 "compare": false, 00:20:11.859 "compare_and_write": false, 00:20:11.859 "abort": true, 00:20:11.859 "seek_hole": false, 00:20:11.859 "seek_data": false, 00:20:11.859 "copy": true, 00:20:11.859 "nvme_iov_md": false 00:20:11.859 }, 00:20:11.859 "memory_domains": [ 00:20:11.859 { 00:20:11.859 "dma_device_id": "system", 00:20:11.859 "dma_device_type": 1 00:20:11.859 }, 00:20:11.859 { 00:20:11.859 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:11.859 "dma_device_type": 2 00:20:11.859 } 00:20:11.859 ], 00:20:11.859 "driver_specific": { 00:20:11.859 "passthru": { 00:20:11.859 "name": "pt2", 00:20:11.859 "base_bdev_name": "malloc2" 00:20:11.859 } 00:20:11.859 } 00:20:11.859 }' 00:20:11.859 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:20:12.120 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.120 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:12.120 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:12.120 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:12.120 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:12.120 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:12.120 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:12.120 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:12.120 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:12.380 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:12.380 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:12.380 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:12.380 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:12.380 13:28:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:12.640 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:12.640 "name": "pt3", 00:20:12.640 "aliases": [ 00:20:12.640 "00000000-0000-0000-0000-000000000003" 00:20:12.640 ], 00:20:12.640 "product_name": "passthru", 00:20:12.640 "block_size": 512, 00:20:12.640 "num_blocks": 65536, 00:20:12.640 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:12.640 "assigned_rate_limits": { 00:20:12.640 "rw_ios_per_sec": 0, 00:20:12.640 "rw_mbytes_per_sec": 0, 
00:20:12.640 "r_mbytes_per_sec": 0, 00:20:12.640 "w_mbytes_per_sec": 0 00:20:12.640 }, 00:20:12.640 "claimed": true, 00:20:12.640 "claim_type": "exclusive_write", 00:20:12.640 "zoned": false, 00:20:12.640 "supported_io_types": { 00:20:12.640 "read": true, 00:20:12.640 "write": true, 00:20:12.640 "unmap": true, 00:20:12.640 "flush": true, 00:20:12.640 "reset": true, 00:20:12.640 "nvme_admin": false, 00:20:12.640 "nvme_io": false, 00:20:12.640 "nvme_io_md": false, 00:20:12.640 "write_zeroes": true, 00:20:12.640 "zcopy": true, 00:20:12.640 "get_zone_info": false, 00:20:12.640 "zone_management": false, 00:20:12.640 "zone_append": false, 00:20:12.640 "compare": false, 00:20:12.640 "compare_and_write": false, 00:20:12.640 "abort": true, 00:20:12.640 "seek_hole": false, 00:20:12.640 "seek_data": false, 00:20:12.640 "copy": true, 00:20:12.640 "nvme_iov_md": false 00:20:12.640 }, 00:20:12.640 "memory_domains": [ 00:20:12.640 { 00:20:12.640 "dma_device_id": "system", 00:20:12.640 "dma_device_type": 1 00:20:12.640 }, 00:20:12.640 { 00:20:12.640 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:12.640 "dma_device_type": 2 00:20:12.640 } 00:20:12.640 ], 00:20:12.640 "driver_specific": { 00:20:12.640 "passthru": { 00:20:12.640 "name": "pt3", 00:20:12.640 "base_bdev_name": "malloc3" 00:20:12.640 } 00:20:12.640 } 00:20:12.640 }' 00:20:12.640 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.640 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:12.640 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:12.640 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:12.640 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:12.640 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:12.640 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 
-- # jq .md_interleave 00:20:12.640 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:12.899 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:12.899 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:12.899 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:12.899 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:12.899 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:12.899 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:12.899 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:13.159 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:13.159 "name": "pt4", 00:20:13.159 "aliases": [ 00:20:13.159 "00000000-0000-0000-0000-000000000004" 00:20:13.159 ], 00:20:13.159 "product_name": "passthru", 00:20:13.159 "block_size": 512, 00:20:13.159 "num_blocks": 65536, 00:20:13.159 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:13.159 "assigned_rate_limits": { 00:20:13.159 "rw_ios_per_sec": 0, 00:20:13.159 "rw_mbytes_per_sec": 0, 00:20:13.159 "r_mbytes_per_sec": 0, 00:20:13.159 "w_mbytes_per_sec": 0 00:20:13.159 }, 00:20:13.159 "claimed": true, 00:20:13.159 "claim_type": "exclusive_write", 00:20:13.159 "zoned": false, 00:20:13.159 "supported_io_types": { 00:20:13.159 "read": true, 00:20:13.159 "write": true, 00:20:13.159 "unmap": true, 00:20:13.159 "flush": true, 00:20:13.159 "reset": true, 00:20:13.159 "nvme_admin": false, 00:20:13.159 "nvme_io": false, 00:20:13.159 "nvme_io_md": false, 00:20:13.159 "write_zeroes": true, 00:20:13.159 "zcopy": true, 00:20:13.159 "get_zone_info": false, 00:20:13.159 
"zone_management": false, 00:20:13.159 "zone_append": false, 00:20:13.159 "compare": false, 00:20:13.159 "compare_and_write": false, 00:20:13.159 "abort": true, 00:20:13.159 "seek_hole": false, 00:20:13.159 "seek_data": false, 00:20:13.159 "copy": true, 00:20:13.159 "nvme_iov_md": false 00:20:13.159 }, 00:20:13.159 "memory_domains": [ 00:20:13.159 { 00:20:13.159 "dma_device_id": "system", 00:20:13.159 "dma_device_type": 1 00:20:13.159 }, 00:20:13.159 { 00:20:13.159 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:13.159 "dma_device_type": 2 00:20:13.159 } 00:20:13.159 ], 00:20:13.159 "driver_specific": { 00:20:13.159 "passthru": { 00:20:13.159 "name": "pt4", 00:20:13.159 "base_bdev_name": "malloc4" 00:20:13.159 } 00:20:13.159 } 00:20:13.159 }' 00:20:13.159 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:13.159 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:13.159 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:13.159 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:13.159 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:13.159 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:13.159 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:13.159 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:13.419 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:13.419 13:28:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:13.419 13:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:13.419 13:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:13.419 13:28:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:13.419 13:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:20:13.679 [2024-07-25 13:28:54.247140] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' 064579f7-756b-4309-8d08-f1bca32ccbfe '!=' 064579f7-756b-4309-8d08-f1bca32ccbfe ']' 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy concat 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 971127 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 971127 ']' 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 971127 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 971127 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 971127' 00:20:13.679 killing process with pid 971127 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 971127 
00:20:13.679 [2024-07-25 13:28:54.316420] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:13.679 [2024-07-25 13:28:54.316464] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:13.679 [2024-07-25 13:28:54.316511] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:13.679 [2024-07-25 13:28:54.316517] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1110480 name raid_bdev1, state offline 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 971127 00:20:13.679 [2024-07-25 13:28:54.337041] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:20:13.679 00:20:13.679 real 0m13.905s 00:20:13.679 user 0m25.623s 00:20:13.679 sys 0m2.076s 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:13.679 13:28:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.679 ************************************ 00:20:13.679 END TEST raid_superblock_test 00:20:13.679 ************************************ 00:20:13.941 13:28:54 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:20:13.941 13:28:54 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:13.941 13:28:54 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:13.941 13:28:54 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:13.941 ************************************ 00:20:13.941 START TEST raid_read_error_test 00:20:13.941 ************************************ 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 4 read 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local 
raid_level=concat 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 
00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.CCsSc7foZG 00:20:13.941 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=973722 00:20:13.942 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 973722 /var/tmp/spdk-raid.sock 00:20:13.942 13:28:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 973722 ']' 00:20:13.942 13:28:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:13.942 13:28:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:13.942 13:28:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:13.942 13:28:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:20:13.942 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:13.942 13:28:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:13.942 13:28:54 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:13.942 [2024-07-25 13:28:54.609627] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:20:13.942 [2024-07-25 13:28:54.609687] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid973722 ] 00:20:13.942 [2024-07-25 13:28:54.700653] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:14.202 [2024-07-25 13:28:54.768925] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:14.202 [2024-07-25 13:28:54.815768] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:14.202 [2024-07-25 13:28:54.815793] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:14.772 13:28:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:14.772 13:28:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:20:14.772 13:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:14.772 13:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:15.032 BaseBdev1_malloc 00:20:15.032 13:28:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:15.292 true 00:20:15.292 13:28:55 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:15.292 [2024-07-25 13:28:56.031091] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:15.292 [2024-07-25 13:28:56.031122] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:15.292 [2024-07-25 13:28:56.031133] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8d52a0 00:20:15.292 [2024-07-25 13:28:56.031140] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:15.292 [2024-07-25 13:28:56.032383] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:15.292 [2024-07-25 13:28:56.032401] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:15.292 BaseBdev1 00:20:15.292 13:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:15.292 13:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:15.552 BaseBdev2_malloc 00:20:15.552 13:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:15.812 true 00:20:15.812 13:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:16.072 [2024-07-25 13:28:56.638180] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:16.072 [2024-07-25 13:28:56.638208] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:16.072 [2024-07-25 13:28:56.638220] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x994420 00:20:16.072 [2024-07-25 13:28:56.638226] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:16.072 [2024-07-25 13:28:56.639417] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:16.072 [2024-07-25 13:28:56.639435] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:16.072 BaseBdev2 00:20:16.072 13:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:16.072 13:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:16.072 BaseBdev3_malloc 00:20:16.072 13:28:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:16.335 true 00:20:16.335 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:16.612 [2024-07-25 13:28:57.145284] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:16.612 [2024-07-25 13:28:57.145310] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:16.612 [2024-07-25 13:28:57.145322] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x995f70 00:20:16.612 [2024-07-25 13:28:57.145328] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:16.612 [2024-07-25 13:28:57.146480] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:20:16.612 [2024-07-25 13:28:57.146499] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:16.612 BaseBdev3 00:20:16.612 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:16.612 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:16.612 BaseBdev4_malloc 00:20:16.612 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:16.888 true 00:20:16.888 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:17.148 [2024-07-25 13:28:57.716411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:17.148 [2024-07-25 13:28:57.716438] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:17.148 [2024-07-25 13:28:57.716449] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9991e0 00:20:17.148 [2024-07-25 13:28:57.716455] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:17.148 [2024-07-25 13:28:57.717637] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:17.148 [2024-07-25 13:28:57.717656] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:17.148 BaseBdev4 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 
00:20:17.149 [2024-07-25 13:28:57.904920] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:17.149 [2024-07-25 13:28:57.905969] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:17.149 [2024-07-25 13:28:57.906024] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:17.149 [2024-07-25 13:28:57.906070] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:17.149 [2024-07-25 13:28:57.906232] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x999800 00:20:17.149 [2024-07-25 13:28:57.906239] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:17.149 [2024-07-25 13:28:57.906393] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x994950 00:20:17.149 [2024-07-25 13:28:57.906510] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x999800 00:20:17.149 [2024-07-25 13:28:57.906515] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x999800 00:20:17.149 [2024-07-25 13:28:57.906608] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:17.149 13:28:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:17.409 13:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:17.409 "name": "raid_bdev1", 00:20:17.409 "uuid": "9ea86cfd-e39e-4ef2-a7f9-da138b567621", 00:20:17.409 "strip_size_kb": 64, 00:20:17.409 "state": "online", 00:20:17.409 "raid_level": "concat", 00:20:17.409 "superblock": true, 00:20:17.409 "num_base_bdevs": 4, 00:20:17.409 "num_base_bdevs_discovered": 4, 00:20:17.409 "num_base_bdevs_operational": 4, 00:20:17.409 "base_bdevs_list": [ 00:20:17.409 { 00:20:17.409 "name": "BaseBdev1", 00:20:17.409 "uuid": "322d2dfc-c8c1-5b5a-859e-afaf30471b55", 00:20:17.409 "is_configured": true, 00:20:17.409 "data_offset": 2048, 00:20:17.409 "data_size": 63488 00:20:17.409 }, 00:20:17.409 { 00:20:17.409 "name": "BaseBdev2", 00:20:17.409 "uuid": "c857c68b-97bd-509c-b490-9da41b828add", 00:20:17.409 "is_configured": true, 00:20:17.409 "data_offset": 2048, 00:20:17.409 "data_size": 63488 00:20:17.409 }, 00:20:17.409 { 00:20:17.409 "name": "BaseBdev3", 00:20:17.409 "uuid": "f8ce938f-3b7d-509f-ade6-515bc8a90334", 00:20:17.409 "is_configured": true, 00:20:17.409 "data_offset": 2048, 00:20:17.409 "data_size": 63488 00:20:17.409 }, 00:20:17.409 { 00:20:17.409 "name": "BaseBdev4", 00:20:17.409 "uuid": "86a5c097-1f37-5437-a3d1-8f9ea1a91b70", 00:20:17.409 "is_configured": 
true, 00:20:17.409 "data_offset": 2048, 00:20:17.409 "data_size": 63488 00:20:17.409 } 00:20:17.409 ] 00:20:17.409 }' 00:20:17.409 13:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:17.409 13:28:58 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:17.978 13:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:20:17.978 13:28:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:17.978 [2024-07-25 13:28:58.759398] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x99a0e0 00:20:18.916 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:19.176 
13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.176 13:28:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:19.436 13:29:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:19.436 "name": "raid_bdev1", 00:20:19.436 "uuid": "9ea86cfd-e39e-4ef2-a7f9-da138b567621", 00:20:19.436 "strip_size_kb": 64, 00:20:19.436 "state": "online", 00:20:19.436 "raid_level": "concat", 00:20:19.436 "superblock": true, 00:20:19.436 "num_base_bdevs": 4, 00:20:19.436 "num_base_bdevs_discovered": 4, 00:20:19.436 "num_base_bdevs_operational": 4, 00:20:19.436 "base_bdevs_list": [ 00:20:19.436 { 00:20:19.436 "name": "BaseBdev1", 00:20:19.436 "uuid": "322d2dfc-c8c1-5b5a-859e-afaf30471b55", 00:20:19.436 "is_configured": true, 00:20:19.436 "data_offset": 2048, 00:20:19.436 "data_size": 63488 00:20:19.436 }, 00:20:19.436 { 00:20:19.436 "name": "BaseBdev2", 00:20:19.436 "uuid": "c857c68b-97bd-509c-b490-9da41b828add", 00:20:19.436 "is_configured": true, 00:20:19.436 "data_offset": 2048, 00:20:19.436 "data_size": 63488 00:20:19.436 }, 00:20:19.436 { 00:20:19.436 "name": "BaseBdev3", 00:20:19.436 "uuid": "f8ce938f-3b7d-509f-ade6-515bc8a90334", 00:20:19.436 "is_configured": true, 00:20:19.436 "data_offset": 2048, 00:20:19.436 "data_size": 63488 00:20:19.436 }, 00:20:19.436 { 00:20:19.436 "name": "BaseBdev4", 00:20:19.436 "uuid": 
"86a5c097-1f37-5437-a3d1-8f9ea1a91b70", 00:20:19.436 "is_configured": true, 00:20:19.436 "data_offset": 2048, 00:20:19.436 "data_size": 63488 00:20:19.436 } 00:20:19.436 ] 00:20:19.436 }' 00:20:19.436 13:29:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:19.436 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.006 13:29:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:20.265 [2024-07-25 13:29:00.803274] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:20.265 [2024-07-25 13:29:00.803300] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:20.265 [2024-07-25 13:29:00.805880] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:20.265 [2024-07-25 13:29:00.805908] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:20.265 [2024-07-25 13:29:00.805937] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:20.265 [2024-07-25 13:29:00.805942] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x999800 name raid_bdev1, state offline 00:20:20.265 0 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 973722 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 973722 ']' 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 973722 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps 
--no-headers -o comm= 973722 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 973722' 00:20:20.265 killing process with pid 973722 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 973722 00:20:20.265 [2024-07-25 13:29:00.869190] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:20.265 13:29:00 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 973722 00:20:20.265 [2024-07-25 13:29:00.886466] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:20.266 13:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.CCsSc7foZG 00:20:20.266 13:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:20:20.266 13:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:20:20.266 13:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.49 00:20:20.266 13:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:20:20.266 13:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:20.266 13:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:20.266 13:29:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.49 != \0\.\0\0 ]] 00:20:20.266 00:20:20.266 real 0m6.486s 00:20:20.266 user 0m10.417s 00:20:20.266 sys 0m0.936s 00:20:20.266 13:29:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:20.266 13:29:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.266 
************************************ 00:20:20.266 END TEST raid_read_error_test 00:20:20.266 ************************************ 00:20:20.525 13:29:01 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:20:20.525 13:29:01 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:20.525 13:29:01 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:20.525 13:29:01 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:20.525 ************************************ 00:20:20.525 START TEST raid_write_error_test 00:20:20.525 ************************************ 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test concat 4 write 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=concat 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # 
echo BaseBdev3 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' concat '!=' raid1 ']' 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # strip_size=64 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@816 -- # create_arg+=' -z 64' 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.FZdVDADujS 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=974916 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@825 -- # waitforlisten 974916 /var/tmp/spdk-raid.sock 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@831 -- # '[' -z 974916 ']' 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:20.525 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:20.525 13:29:01 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:20.525 [2024-07-25 13:29:01.170871] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:20:20.526 [2024-07-25 13:29:01.170915] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid974916 ] 00:20:20.526 [2024-07-25 13:29:01.258626] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:20.784 [2024-07-25 13:29:01.321095] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:20.785 [2024-07-25 13:29:01.360685] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:20.785 [2024-07-25 13:29:01.360708] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:21.353 13:29:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:21.353 13:29:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:20:21.353 13:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:21.353 13:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:20:21.611 BaseBdev1_malloc 00:20:21.611 13:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:20:21.611 true 00:20:21.611 13:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:20:21.870 [2024-07-25 13:29:02.559431] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:20:21.870 [2024-07-25 13:29:02.559463] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base 
bdev opened 00:20:21.870 [2024-07-25 13:29:02.559474] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1af32a0 00:20:21.870 [2024-07-25 13:29:02.559480] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:21.870 [2024-07-25 13:29:02.560782] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:21.870 [2024-07-25 13:29:02.560801] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:20:21.870 BaseBdev1 00:20:21.870 13:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:21.870 13:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:20:22.129 BaseBdev2_malloc 00:20:22.129 13:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:20:22.388 true 00:20:22.388 13:29:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:20:22.388 [2024-07-25 13:29:03.170796] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:20:22.388 [2024-07-25 13:29:03.170827] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:22.388 [2024-07-25 13:29:03.170839] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb2420 00:20:22.388 [2024-07-25 13:29:03.170846] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:22.388 [2024-07-25 13:29:03.172038] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:22.388 [2024-07-25 13:29:03.172057] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:20:22.388 BaseBdev2 00:20:22.651 13:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:22.651 13:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:20:22.651 BaseBdev3_malloc 00:20:22.651 13:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:20:22.913 true 00:20:22.913 13:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:20:23.174 [2024-07-25 13:29:03.769894] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:20:23.174 [2024-07-25 13:29:03.769919] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:23.174 [2024-07-25 13:29:03.769931] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb3f70 00:20:23.174 [2024-07-25 13:29:03.769938] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:23.174 [2024-07-25 13:29:03.771088] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:23.174 [2024-07-25 13:29:03.771107] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:20:23.174 BaseBdev3 00:20:23.174 13:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:20:23.174 13:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:20:23.433 BaseBdev4_malloc 00:20:23.433 13:29:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:20:23.433 true 00:20:23.433 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:20:23.694 [2024-07-25 13:29:04.365162] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:20:23.694 [2024-07-25 13:29:04.365190] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:23.694 [2024-07-25 13:29:04.365201] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bb71e0 00:20:23.694 [2024-07-25 13:29:04.365207] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:23.694 [2024-07-25 13:29:04.366369] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:23.694 [2024-07-25 13:29:04.366387] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:20:23.694 BaseBdev4 00:20:23.694 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:20:23.953 [2024-07-25 13:29:04.573715] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:23.953 [2024-07-25 13:29:04.574754] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:23.953 [2024-07-25 13:29:04.574808] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:23.953 [2024-07-25 13:29:04.574853] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:23.953 [2024-07-25 13:29:04.575018] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bb7800 00:20:23.953 [2024-07-25 13:29:04.575025] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:20:23.953 [2024-07-25 13:29:04.575238] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bb2950 00:20:23.953 [2024-07-25 13:29:04.575354] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bb7800 00:20:23.953 [2024-07-25 13:29:04.575359] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1bb7800 00:20:23.953 [2024-07-25 13:29:04.575443] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:23.953 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:23.953 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:23.953 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:23.953 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:23.953 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:23.953 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:23.953 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:23.953 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:23.953 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:23.953 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:23.954 13:29:04 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:23.954 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:24.213 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:24.213 "name": "raid_bdev1", 00:20:24.214 "uuid": "5d9777ac-cf6b-4057-b20a-844851828334", 00:20:24.214 "strip_size_kb": 64, 00:20:24.214 "state": "online", 00:20:24.214 "raid_level": "concat", 00:20:24.214 "superblock": true, 00:20:24.214 "num_base_bdevs": 4, 00:20:24.214 "num_base_bdevs_discovered": 4, 00:20:24.214 "num_base_bdevs_operational": 4, 00:20:24.214 "base_bdevs_list": [ 00:20:24.214 { 00:20:24.214 "name": "BaseBdev1", 00:20:24.214 "uuid": "48f2f5fb-1fdc-542f-b7b7-a31ee56f7bb2", 00:20:24.214 "is_configured": true, 00:20:24.214 "data_offset": 2048, 00:20:24.214 "data_size": 63488 00:20:24.214 }, 00:20:24.214 { 00:20:24.214 "name": "BaseBdev2", 00:20:24.214 "uuid": "084a62cb-7259-5f61-960e-36ae29f4cb1e", 00:20:24.214 "is_configured": true, 00:20:24.214 "data_offset": 2048, 00:20:24.214 "data_size": 63488 00:20:24.214 }, 00:20:24.214 { 00:20:24.214 "name": "BaseBdev3", 00:20:24.214 "uuid": "61733b9d-eb8e-50cb-87bb-c5d73874631e", 00:20:24.214 "is_configured": true, 00:20:24.214 "data_offset": 2048, 00:20:24.214 "data_size": 63488 00:20:24.214 }, 00:20:24.214 { 00:20:24.214 "name": "BaseBdev4", 00:20:24.214 "uuid": "5ede048c-ac5b-5012-9f89-658e27b35ed7", 00:20:24.214 "is_configured": true, 00:20:24.214 "data_offset": 2048, 00:20:24.214 "data_size": 63488 00:20:24.214 } 00:20:24.214 ] 00:20:24.214 }' 00:20:24.214 13:29:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:24.214 13:29:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:24.783 13:29:05 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@840 -- # sleep 1 00:20:24.783 13:29:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:20:24.783 [2024-07-25 13:29:05.452129] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bb80e0 00:20:25.722 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ concat = \r\a\i\d\1 ]] 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.982 13:29:06 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.982 "name": "raid_bdev1", 00:20:25.982 "uuid": "5d9777ac-cf6b-4057-b20a-844851828334", 00:20:25.982 "strip_size_kb": 64, 00:20:25.982 "state": "online", 00:20:25.982 "raid_level": "concat", 00:20:25.982 "superblock": true, 00:20:25.982 "num_base_bdevs": 4, 00:20:25.982 "num_base_bdevs_discovered": 4, 00:20:25.982 "num_base_bdevs_operational": 4, 00:20:25.982 "base_bdevs_list": [ 00:20:25.982 { 00:20:25.982 "name": "BaseBdev1", 00:20:25.982 "uuid": "48f2f5fb-1fdc-542f-b7b7-a31ee56f7bb2", 00:20:25.982 "is_configured": true, 00:20:25.982 "data_offset": 2048, 00:20:25.982 "data_size": 63488 00:20:25.982 }, 00:20:25.982 { 00:20:25.982 "name": "BaseBdev2", 00:20:25.982 "uuid": "084a62cb-7259-5f61-960e-36ae29f4cb1e", 00:20:25.982 "is_configured": true, 00:20:25.982 "data_offset": 2048, 00:20:25.982 "data_size": 63488 00:20:25.982 }, 00:20:25.982 { 00:20:25.982 "name": "BaseBdev3", 00:20:25.982 "uuid": "61733b9d-eb8e-50cb-87bb-c5d73874631e", 00:20:25.982 "is_configured": true, 00:20:25.982 "data_offset": 2048, 00:20:25.982 "data_size": 63488 00:20:25.982 }, 00:20:25.982 { 00:20:25.982 "name": "BaseBdev4", 00:20:25.982 "uuid": "5ede048c-ac5b-5012-9f89-658e27b35ed7", 00:20:25.982 "is_configured": true, 00:20:25.982 "data_offset": 2048, 00:20:25.982 "data_size": 63488 00:20:25.982 } 00:20:25.982 ] 00:20:25.982 }' 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.982 13:29:06 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:20:26.582 13:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:26.842 [2024-07-25 13:29:07.475135] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:26.843 [2024-07-25 13:29:07.475169] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:26.843 [2024-07-25 13:29:07.477755] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:26.843 [2024-07-25 13:29:07.477784] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:26.843 [2024-07-25 13:29:07.477813] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:26.843 [2024-07-25 13:29:07.477818] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb7800 name raid_bdev1, state offline 00:20:26.843 0 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 974916 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 974916 ']' 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 974916 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 974916 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@968 -- # echo 'killing process with pid 974916' 00:20:26.843 killing process with pid 974916 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 974916 00:20:26.843 [2024-07-25 13:29:07.558650] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:26.843 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 974916 00:20:26.843 [2024-07-25 13:29:07.575719] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:27.104 13:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.FZdVDADujS 00:20:27.104 13:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:20:27.104 13:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:20:27.104 13:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.50 00:20:27.104 13:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy concat 00:20:27.104 13:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:27.104 13:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:20:27.104 13:29:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@863 -- # [[ 0.50 != \0\.\0\0 ]] 00:20:27.104 00:20:27.104 real 0m6.612s 00:20:27.104 user 0m10.670s 00:20:27.104 sys 0m0.938s 00:20:27.104 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:27.104 13:29:07 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.104 ************************************ 00:20:27.104 END TEST raid_write_error_test 00:20:27.104 ************************************ 00:20:27.104 13:29:07 bdev_raid -- bdev/bdev_raid.sh@946 -- # for level in raid0 concat raid1 00:20:27.104 13:29:07 bdev_raid -- bdev/bdev_raid.sh@947 -- # run_test raid_state_function_test 
raid_state_function_test raid1 4 false 00:20:27.104 13:29:07 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:27.104 13:29:07 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:27.104 13:29:07 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:27.104 ************************************ 00:20:27.104 START TEST raid_state_function_test 00:20:27.104 ************************************ 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 4 false 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:27.104 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=976085 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 976085' 00:20:27.105 Process raid pid: 976085 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 976085 /var/tmp/spdk-raid.sock 
00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@831 -- # '[' -z 976085 ']' 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:27.105 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:27.105 13:29:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:27.105 [2024-07-25 13:29:07.860359] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:20:27.105 [2024-07-25 13:29:07.860417] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:27.364 [2024-07-25 13:29:07.951507] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:27.364 [2024-07-25 13:29:08.020862] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:27.364 [2024-07-25 13:29:08.071009] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:27.364 [2024-07-25 13:29:08.071035] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:27.934 13:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:27.934 13:29:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@864 -- # return 0 00:20:27.934 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:28.194 [2024-07-25 13:29:08.874396] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:28.194 [2024-07-25 13:29:08.874425] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:28.194 [2024-07-25 13:29:08.874432] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:28.194 [2024-07-25 13:29:08.874437] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:28.194 [2024-07-25 13:29:08.874442] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:28.194 [2024-07-25 13:29:08.874447] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:28.194 
[2024-07-25 13:29:08.874452] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:28.194 [2024-07-25 13:29:08.874457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.194 13:29:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:28.454 13:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:28.454 "name": "Existed_Raid", 00:20:28.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.454 "strip_size_kb": 0, 00:20:28.454 "state": 
"configuring", 00:20:28.454 "raid_level": "raid1", 00:20:28.454 "superblock": false, 00:20:28.454 "num_base_bdevs": 4, 00:20:28.454 "num_base_bdevs_discovered": 0, 00:20:28.454 "num_base_bdevs_operational": 4, 00:20:28.454 "base_bdevs_list": [ 00:20:28.454 { 00:20:28.454 "name": "BaseBdev1", 00:20:28.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.454 "is_configured": false, 00:20:28.454 "data_offset": 0, 00:20:28.454 "data_size": 0 00:20:28.454 }, 00:20:28.454 { 00:20:28.454 "name": "BaseBdev2", 00:20:28.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.454 "is_configured": false, 00:20:28.454 "data_offset": 0, 00:20:28.454 "data_size": 0 00:20:28.454 }, 00:20:28.454 { 00:20:28.454 "name": "BaseBdev3", 00:20:28.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.454 "is_configured": false, 00:20:28.454 "data_offset": 0, 00:20:28.454 "data_size": 0 00:20:28.454 }, 00:20:28.454 { 00:20:28.454 "name": "BaseBdev4", 00:20:28.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:28.454 "is_configured": false, 00:20:28.454 "data_offset": 0, 00:20:28.454 "data_size": 0 00:20:28.454 } 00:20:28.454 ] 00:20:28.454 }' 00:20:28.454 13:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:28.454 13:29:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:29.022 13:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:29.022 [2024-07-25 13:29:09.804659] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:29.022 [2024-07-25 13:29:09.804678] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27906f0 name Existed_Raid, state configuring 00:20:29.281 13:29:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:29.541 [2024-07-25 13:29:10.329999] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:29.541 [2024-07-25 13:29:10.330026] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:29.541 [2024-07-25 13:29:10.330032] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:29.541 [2024-07-25 13:29:10.330037] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:29.541 [2024-07-25 13:29:10.330042] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:29.541 [2024-07-25 13:29:10.330047] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:29.541 [2024-07-25 13:29:10.330052] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:29.541 [2024-07-25 13:29:10.330057] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:29.801 13:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:30.371 [2024-07-25 13:29:10.877930] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:30.371 BaseBdev1 00:20:30.371 13:29:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:30.371 13:29:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:30.371 13:29:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:30.371 13:29:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:30.371 13:29:10 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:30.371 13:29:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:30.371 13:29:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:30.940 13:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:31.199 [ 00:20:31.199 { 00:20:31.199 "name": "BaseBdev1", 00:20:31.199 "aliases": [ 00:20:31.199 "a2e66ddd-f076-4d34-9792-d2dec51ed018" 00:20:31.199 ], 00:20:31.199 "product_name": "Malloc disk", 00:20:31.199 "block_size": 512, 00:20:31.199 "num_blocks": 65536, 00:20:31.199 "uuid": "a2e66ddd-f076-4d34-9792-d2dec51ed018", 00:20:31.199 "assigned_rate_limits": { 00:20:31.199 "rw_ios_per_sec": 0, 00:20:31.199 "rw_mbytes_per_sec": 0, 00:20:31.199 "r_mbytes_per_sec": 0, 00:20:31.199 "w_mbytes_per_sec": 0 00:20:31.199 }, 00:20:31.199 "claimed": true, 00:20:31.199 "claim_type": "exclusive_write", 00:20:31.199 "zoned": false, 00:20:31.199 "supported_io_types": { 00:20:31.199 "read": true, 00:20:31.199 "write": true, 00:20:31.199 "unmap": true, 00:20:31.199 "flush": true, 00:20:31.199 "reset": true, 00:20:31.199 "nvme_admin": false, 00:20:31.199 "nvme_io": false, 00:20:31.199 "nvme_io_md": false, 00:20:31.199 "write_zeroes": true, 00:20:31.199 "zcopy": true, 00:20:31.199 "get_zone_info": false, 00:20:31.199 "zone_management": false, 00:20:31.200 "zone_append": false, 00:20:31.200 "compare": false, 00:20:31.200 "compare_and_write": false, 00:20:31.200 "abort": true, 00:20:31.200 "seek_hole": false, 00:20:31.200 "seek_data": false, 00:20:31.200 "copy": true, 00:20:31.200 "nvme_iov_md": false 00:20:31.200 }, 00:20:31.200 "memory_domains": [ 00:20:31.200 { 
00:20:31.200 "dma_device_id": "system", 00:20:31.200 "dma_device_type": 1 00:20:31.200 }, 00:20:31.200 { 00:20:31.200 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:31.200 "dma_device_type": 2 00:20:31.200 } 00:20:31.200 ], 00:20:31.200 "driver_specific": {} 00:20:31.200 } 00:20:31.200 ] 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.200 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.459 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.459 13:29:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:32.027 13:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:20:32.027 "name": "Existed_Raid", 00:20:32.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:32.027 "strip_size_kb": 0, 00:20:32.027 "state": "configuring", 00:20:32.027 "raid_level": "raid1", 00:20:32.027 "superblock": false, 00:20:32.027 "num_base_bdevs": 4, 00:20:32.027 "num_base_bdevs_discovered": 1, 00:20:32.027 "num_base_bdevs_operational": 4, 00:20:32.027 "base_bdevs_list": [ 00:20:32.027 { 00:20:32.027 "name": "BaseBdev1", 00:20:32.027 "uuid": "a2e66ddd-f076-4d34-9792-d2dec51ed018", 00:20:32.027 "is_configured": true, 00:20:32.027 "data_offset": 0, 00:20:32.027 "data_size": 65536 00:20:32.027 }, 00:20:32.027 { 00:20:32.027 "name": "BaseBdev2", 00:20:32.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:32.027 "is_configured": false, 00:20:32.027 "data_offset": 0, 00:20:32.027 "data_size": 0 00:20:32.027 }, 00:20:32.027 { 00:20:32.027 "name": "BaseBdev3", 00:20:32.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:32.027 "is_configured": false, 00:20:32.027 "data_offset": 0, 00:20:32.027 "data_size": 0 00:20:32.027 }, 00:20:32.027 { 00:20:32.027 "name": "BaseBdev4", 00:20:32.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:32.027 "is_configured": false, 00:20:32.027 "data_offset": 0, 00:20:32.027 "data_size": 0 00:20:32.027 } 00:20:32.027 ] 00:20:32.027 }' 00:20:32.027 13:29:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:32.027 13:29:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:32.596 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:32.596 [2024-07-25 13:29:13.263976] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:32.596 [2024-07-25 13:29:13.264005] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x278ff60 name Existed_Raid, state configuring 
00:20:32.596 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:32.855 [2024-07-25 13:29:13.464504] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:32.855 [2024-07-25 13:29:13.465678] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:32.855 [2024-07-25 13:29:13.465701] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:32.855 [2024-07-25 13:29:13.465707] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:32.855 [2024-07-25 13:29:13.465713] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:32.855 [2024-07-25 13:29:13.465719] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:32.855 [2024-07-25 13:29:13.465724] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.856 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:33.115 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:33.115 "name": "Existed_Raid", 00:20:33.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.115 "strip_size_kb": 0, 00:20:33.115 "state": "configuring", 00:20:33.115 "raid_level": "raid1", 00:20:33.115 "superblock": false, 00:20:33.115 "num_base_bdevs": 4, 00:20:33.115 "num_base_bdevs_discovered": 1, 00:20:33.115 "num_base_bdevs_operational": 4, 00:20:33.115 "base_bdevs_list": [ 00:20:33.115 { 00:20:33.115 "name": "BaseBdev1", 00:20:33.115 "uuid": "a2e66ddd-f076-4d34-9792-d2dec51ed018", 00:20:33.115 "is_configured": true, 00:20:33.115 "data_offset": 0, 00:20:33.115 "data_size": 65536 00:20:33.115 }, 00:20:33.115 { 00:20:33.115 "name": "BaseBdev2", 00:20:33.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.115 "is_configured": false, 00:20:33.115 "data_offset": 0, 00:20:33.115 "data_size": 0 00:20:33.115 }, 00:20:33.115 { 00:20:33.115 "name": "BaseBdev3", 00:20:33.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.115 "is_configured": false, 00:20:33.115 
"data_offset": 0, 00:20:33.115 "data_size": 0 00:20:33.115 }, 00:20:33.115 { 00:20:33.115 "name": "BaseBdev4", 00:20:33.115 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:33.115 "is_configured": false, 00:20:33.115 "data_offset": 0, 00:20:33.115 "data_size": 0 00:20:33.115 } 00:20:33.115 ] 00:20:33.115 }' 00:20:33.115 13:29:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:33.115 13:29:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:33.685 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:33.685 [2024-07-25 13:29:14.367667] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:33.685 BaseBdev2 00:20:33.685 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:33.685 13:29:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:33.685 13:29:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:33.685 13:29:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:33.685 13:29:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:33.685 13:29:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:33.685 13:29:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:33.945 13:29:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:34.205 [ 
00:20:34.205 { 00:20:34.205 "name": "BaseBdev2", 00:20:34.205 "aliases": [ 00:20:34.205 "fcf54f34-afac-4b01-a6a2-f919efef98fb" 00:20:34.205 ], 00:20:34.205 "product_name": "Malloc disk", 00:20:34.205 "block_size": 512, 00:20:34.205 "num_blocks": 65536, 00:20:34.205 "uuid": "fcf54f34-afac-4b01-a6a2-f919efef98fb", 00:20:34.205 "assigned_rate_limits": { 00:20:34.205 "rw_ios_per_sec": 0, 00:20:34.205 "rw_mbytes_per_sec": 0, 00:20:34.205 "r_mbytes_per_sec": 0, 00:20:34.205 "w_mbytes_per_sec": 0 00:20:34.205 }, 00:20:34.205 "claimed": true, 00:20:34.205 "claim_type": "exclusive_write", 00:20:34.205 "zoned": false, 00:20:34.205 "supported_io_types": { 00:20:34.205 "read": true, 00:20:34.205 "write": true, 00:20:34.205 "unmap": true, 00:20:34.205 "flush": true, 00:20:34.205 "reset": true, 00:20:34.205 "nvme_admin": false, 00:20:34.205 "nvme_io": false, 00:20:34.205 "nvme_io_md": false, 00:20:34.205 "write_zeroes": true, 00:20:34.205 "zcopy": true, 00:20:34.205 "get_zone_info": false, 00:20:34.205 "zone_management": false, 00:20:34.205 "zone_append": false, 00:20:34.205 "compare": false, 00:20:34.205 "compare_and_write": false, 00:20:34.205 "abort": true, 00:20:34.205 "seek_hole": false, 00:20:34.205 "seek_data": false, 00:20:34.205 "copy": true, 00:20:34.205 "nvme_iov_md": false 00:20:34.205 }, 00:20:34.205 "memory_domains": [ 00:20:34.205 { 00:20:34.205 "dma_device_id": "system", 00:20:34.205 "dma_device_type": 1 00:20:34.205 }, 00:20:34.205 { 00:20:34.205 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:34.205 "dma_device_type": 2 00:20:34.205 } 00:20:34.205 ], 00:20:34.205 "driver_specific": {} 00:20:34.205 } 00:20:34.205 ] 00:20:34.205 13:29:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:34.205 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:34.205 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:34.205 13:29:14 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:34.205 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.206 "name": "Existed_Raid", 00:20:34.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.206 "strip_size_kb": 0, 00:20:34.206 "state": "configuring", 00:20:34.206 "raid_level": "raid1", 00:20:34.206 "superblock": false, 00:20:34.206 "num_base_bdevs": 4, 00:20:34.206 "num_base_bdevs_discovered": 2, 00:20:34.206 "num_base_bdevs_operational": 4, 00:20:34.206 "base_bdevs_list": [ 00:20:34.206 { 00:20:34.206 
"name": "BaseBdev1", 00:20:34.206 "uuid": "a2e66ddd-f076-4d34-9792-d2dec51ed018", 00:20:34.206 "is_configured": true, 00:20:34.206 "data_offset": 0, 00:20:34.206 "data_size": 65536 00:20:34.206 }, 00:20:34.206 { 00:20:34.206 "name": "BaseBdev2", 00:20:34.206 "uuid": "fcf54f34-afac-4b01-a6a2-f919efef98fb", 00:20:34.206 "is_configured": true, 00:20:34.206 "data_offset": 0, 00:20:34.206 "data_size": 65536 00:20:34.206 }, 00:20:34.206 { 00:20:34.206 "name": "BaseBdev3", 00:20:34.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.206 "is_configured": false, 00:20:34.206 "data_offset": 0, 00:20:34.206 "data_size": 0 00:20:34.206 }, 00:20:34.206 { 00:20:34.206 "name": "BaseBdev4", 00:20:34.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:34.206 "is_configured": false, 00:20:34.206 "data_offset": 0, 00:20:34.206 "data_size": 0 00:20:34.206 } 00:20:34.206 ] 00:20:34.206 }' 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.206 13:29:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:34.775 13:29:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:35.344 [2024-07-25 13:29:16.036724] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:35.344 BaseBdev3 00:20:35.344 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:35.344 13:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:35.344 13:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:35.344 13:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:35.344 13:29:16 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:35.344 13:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:35.344 13:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:35.603 13:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:35.863 [ 00:20:35.863 { 00:20:35.863 "name": "BaseBdev3", 00:20:35.863 "aliases": [ 00:20:35.863 "8bb2f882-a162-47fb-8624-7c9bb4062b74" 00:20:35.863 ], 00:20:35.863 "product_name": "Malloc disk", 00:20:35.863 "block_size": 512, 00:20:35.863 "num_blocks": 65536, 00:20:35.863 "uuid": "8bb2f882-a162-47fb-8624-7c9bb4062b74", 00:20:35.863 "assigned_rate_limits": { 00:20:35.863 "rw_ios_per_sec": 0, 00:20:35.863 "rw_mbytes_per_sec": 0, 00:20:35.863 "r_mbytes_per_sec": 0, 00:20:35.863 "w_mbytes_per_sec": 0 00:20:35.863 }, 00:20:35.863 "claimed": true, 00:20:35.863 "claim_type": "exclusive_write", 00:20:35.863 "zoned": false, 00:20:35.863 "supported_io_types": { 00:20:35.863 "read": true, 00:20:35.863 "write": true, 00:20:35.863 "unmap": true, 00:20:35.863 "flush": true, 00:20:35.863 "reset": true, 00:20:35.863 "nvme_admin": false, 00:20:35.863 "nvme_io": false, 00:20:35.863 "nvme_io_md": false, 00:20:35.863 "write_zeroes": true, 00:20:35.863 "zcopy": true, 00:20:35.863 "get_zone_info": false, 00:20:35.863 "zone_management": false, 00:20:35.863 "zone_append": false, 00:20:35.863 "compare": false, 00:20:35.863 "compare_and_write": false, 00:20:35.863 "abort": true, 00:20:35.863 "seek_hole": false, 00:20:35.863 "seek_data": false, 00:20:35.863 "copy": true, 00:20:35.863 "nvme_iov_md": false 00:20:35.863 }, 00:20:35.863 "memory_domains": [ 00:20:35.863 { 00:20:35.863 "dma_device_id": "system", 
00:20:35.863 "dma_device_type": 1 00:20:35.863 }, 00:20:35.863 { 00:20:35.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:35.863 "dma_device_type": 2 00:20:35.863 } 00:20:35.863 ], 00:20:35.863 "driver_specific": {} 00:20:35.863 } 00:20:35.863 ] 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:35.863 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.122 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.122 "name": "Existed_Raid", 00:20:36.122 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.122 "strip_size_kb": 0, 00:20:36.122 "state": "configuring", 00:20:36.122 "raid_level": "raid1", 00:20:36.122 "superblock": false, 00:20:36.122 "num_base_bdevs": 4, 00:20:36.122 "num_base_bdevs_discovered": 3, 00:20:36.122 "num_base_bdevs_operational": 4, 00:20:36.123 "base_bdevs_list": [ 00:20:36.123 { 00:20:36.123 "name": "BaseBdev1", 00:20:36.123 "uuid": "a2e66ddd-f076-4d34-9792-d2dec51ed018", 00:20:36.123 "is_configured": true, 00:20:36.123 "data_offset": 0, 00:20:36.123 "data_size": 65536 00:20:36.123 }, 00:20:36.123 { 00:20:36.123 "name": "BaseBdev2", 00:20:36.123 "uuid": "fcf54f34-afac-4b01-a6a2-f919efef98fb", 00:20:36.123 "is_configured": true, 00:20:36.123 "data_offset": 0, 00:20:36.123 "data_size": 65536 00:20:36.123 }, 00:20:36.123 { 00:20:36.123 "name": "BaseBdev3", 00:20:36.123 "uuid": "8bb2f882-a162-47fb-8624-7c9bb4062b74", 00:20:36.123 "is_configured": true, 00:20:36.123 "data_offset": 0, 00:20:36.123 "data_size": 65536 00:20:36.123 }, 00:20:36.123 { 00:20:36.123 "name": "BaseBdev4", 00:20:36.123 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:36.123 "is_configured": false, 00:20:36.123 "data_offset": 0, 00:20:36.123 "data_size": 0 00:20:36.123 } 00:20:36.123 ] 00:20:36.123 }' 00:20:36.123 13:29:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.123 13:29:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:36.692 13:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:36.692 [2024-07-25 
13:29:17.429067] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:36.692 [2024-07-25 13:29:17.429092] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2790fd0 00:20:36.692 [2024-07-25 13:29:17.429096] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:36.692 [2024-07-25 13:29:17.429246] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x29358e0 00:20:36.692 [2024-07-25 13:29:17.429346] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2790fd0 00:20:36.692 [2024-07-25 13:29:17.429352] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2790fd0 00:20:36.692 [2024-07-25 13:29:17.429474] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:36.692 BaseBdev4 00:20:36.692 13:29:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:36.692 13:29:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:36.692 13:29:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:36.692 13:29:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:36.692 13:29:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:36.692 13:29:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:36.692 13:29:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:37.262 13:29:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:37.522 [ 
00:20:37.522 { 00:20:37.522 "name": "BaseBdev4", 00:20:37.522 "aliases": [ 00:20:37.522 "b4f5f028-7c6d-449f-a338-4c16f99bd934" 00:20:37.522 ], 00:20:37.522 "product_name": "Malloc disk", 00:20:37.522 "block_size": 512, 00:20:37.522 "num_blocks": 65536, 00:20:37.522 "uuid": "b4f5f028-7c6d-449f-a338-4c16f99bd934", 00:20:37.522 "assigned_rate_limits": { 00:20:37.522 "rw_ios_per_sec": 0, 00:20:37.522 "rw_mbytes_per_sec": 0, 00:20:37.522 "r_mbytes_per_sec": 0, 00:20:37.522 "w_mbytes_per_sec": 0 00:20:37.522 }, 00:20:37.522 "claimed": true, 00:20:37.522 "claim_type": "exclusive_write", 00:20:37.522 "zoned": false, 00:20:37.522 "supported_io_types": { 00:20:37.522 "read": true, 00:20:37.522 "write": true, 00:20:37.522 "unmap": true, 00:20:37.522 "flush": true, 00:20:37.522 "reset": true, 00:20:37.522 "nvme_admin": false, 00:20:37.522 "nvme_io": false, 00:20:37.522 "nvme_io_md": false, 00:20:37.522 "write_zeroes": true, 00:20:37.522 "zcopy": true, 00:20:37.522 "get_zone_info": false, 00:20:37.522 "zone_management": false, 00:20:37.522 "zone_append": false, 00:20:37.522 "compare": false, 00:20:37.522 "compare_and_write": false, 00:20:37.522 "abort": true, 00:20:37.522 "seek_hole": false, 00:20:37.522 "seek_data": false, 00:20:37.522 "copy": true, 00:20:37.522 "nvme_iov_md": false 00:20:37.522 }, 00:20:37.522 "memory_domains": [ 00:20:37.522 { 00:20:37.522 "dma_device_id": "system", 00:20:37.522 "dma_device_type": 1 00:20:37.522 }, 00:20:37.522 { 00:20:37.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:37.522 "dma_device_type": 2 00:20:37.522 } 00:20:37.522 ], 00:20:37.522 "driver_specific": {} 00:20:37.522 } 00:20:37.522 ] 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:37.522 13:29:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.522 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:37.781 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:37.782 "name": "Existed_Raid", 00:20:37.782 "uuid": "58d933f6-74d5-4f0a-b51e-9d8f1844675f", 00:20:37.782 "strip_size_kb": 0, 00:20:37.782 "state": "online", 00:20:37.782 "raid_level": "raid1", 00:20:37.782 "superblock": false, 00:20:37.782 "num_base_bdevs": 4, 00:20:37.782 "num_base_bdevs_discovered": 4, 00:20:37.782 "num_base_bdevs_operational": 4, 00:20:37.782 "base_bdevs_list": [ 00:20:37.782 { 00:20:37.782 "name": 
"BaseBdev1", 00:20:37.782 "uuid": "a2e66ddd-f076-4d34-9792-d2dec51ed018", 00:20:37.782 "is_configured": true, 00:20:37.782 "data_offset": 0, 00:20:37.782 "data_size": 65536 00:20:37.782 }, 00:20:37.782 { 00:20:37.782 "name": "BaseBdev2", 00:20:37.782 "uuid": "fcf54f34-afac-4b01-a6a2-f919efef98fb", 00:20:37.782 "is_configured": true, 00:20:37.782 "data_offset": 0, 00:20:37.782 "data_size": 65536 00:20:37.782 }, 00:20:37.782 { 00:20:37.782 "name": "BaseBdev3", 00:20:37.782 "uuid": "8bb2f882-a162-47fb-8624-7c9bb4062b74", 00:20:37.782 "is_configured": true, 00:20:37.782 "data_offset": 0, 00:20:37.782 "data_size": 65536 00:20:37.782 }, 00:20:37.782 { 00:20:37.782 "name": "BaseBdev4", 00:20:37.782 "uuid": "b4f5f028-7c6d-449f-a338-4c16f99bd934", 00:20:37.782 "is_configured": true, 00:20:37.782 "data_offset": 0, 00:20:37.782 "data_size": 65536 00:20:37.782 } 00:20:37.782 ] 00:20:37.782 }' 00:20:37.782 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:37.782 13:29:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:38.402 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:38.402 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:38.402 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:38.402 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:38.402 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:38.402 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:38.402 13:29:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:38.402 13:29:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:38.402 [2024-07-25 13:29:19.101582] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:38.402 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:38.402 "name": "Existed_Raid", 00:20:38.402 "aliases": [ 00:20:38.402 "58d933f6-74d5-4f0a-b51e-9d8f1844675f" 00:20:38.402 ], 00:20:38.402 "product_name": "Raid Volume", 00:20:38.402 "block_size": 512, 00:20:38.402 "num_blocks": 65536, 00:20:38.402 "uuid": "58d933f6-74d5-4f0a-b51e-9d8f1844675f", 00:20:38.402 "assigned_rate_limits": { 00:20:38.402 "rw_ios_per_sec": 0, 00:20:38.402 "rw_mbytes_per_sec": 0, 00:20:38.402 "r_mbytes_per_sec": 0, 00:20:38.402 "w_mbytes_per_sec": 0 00:20:38.402 }, 00:20:38.402 "claimed": false, 00:20:38.402 "zoned": false, 00:20:38.402 "supported_io_types": { 00:20:38.402 "read": true, 00:20:38.402 "write": true, 00:20:38.402 "unmap": false, 00:20:38.402 "flush": false, 00:20:38.402 "reset": true, 00:20:38.402 "nvme_admin": false, 00:20:38.402 "nvme_io": false, 00:20:38.402 "nvme_io_md": false, 00:20:38.402 "write_zeroes": true, 00:20:38.402 "zcopy": false, 00:20:38.402 "get_zone_info": false, 00:20:38.402 "zone_management": false, 00:20:38.402 "zone_append": false, 00:20:38.402 "compare": false, 00:20:38.402 "compare_and_write": false, 00:20:38.402 "abort": false, 00:20:38.402 "seek_hole": false, 00:20:38.402 "seek_data": false, 00:20:38.402 "copy": false, 00:20:38.402 "nvme_iov_md": false 00:20:38.402 }, 00:20:38.402 "memory_domains": [ 00:20:38.402 { 00:20:38.402 "dma_device_id": "system", 00:20:38.402 "dma_device_type": 1 00:20:38.402 }, 00:20:38.402 { 00:20:38.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:38.402 "dma_device_type": 2 00:20:38.402 }, 00:20:38.402 { 00:20:38.402 "dma_device_id": "system", 00:20:38.402 "dma_device_type": 1 00:20:38.402 }, 00:20:38.402 { 00:20:38.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:20:38.402 "dma_device_type": 2 00:20:38.402 }, 00:20:38.402 { 00:20:38.402 "dma_device_id": "system", 00:20:38.402 "dma_device_type": 1 00:20:38.402 }, 00:20:38.402 { 00:20:38.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:38.402 "dma_device_type": 2 00:20:38.402 }, 00:20:38.402 { 00:20:38.402 "dma_device_id": "system", 00:20:38.402 "dma_device_type": 1 00:20:38.402 }, 00:20:38.402 { 00:20:38.402 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:38.402 "dma_device_type": 2 00:20:38.402 } 00:20:38.402 ], 00:20:38.402 "driver_specific": { 00:20:38.402 "raid": { 00:20:38.402 "uuid": "58d933f6-74d5-4f0a-b51e-9d8f1844675f", 00:20:38.402 "strip_size_kb": 0, 00:20:38.402 "state": "online", 00:20:38.402 "raid_level": "raid1", 00:20:38.402 "superblock": false, 00:20:38.402 "num_base_bdevs": 4, 00:20:38.402 "num_base_bdevs_discovered": 4, 00:20:38.402 "num_base_bdevs_operational": 4, 00:20:38.402 "base_bdevs_list": [ 00:20:38.402 { 00:20:38.402 "name": "BaseBdev1", 00:20:38.402 "uuid": "a2e66ddd-f076-4d34-9792-d2dec51ed018", 00:20:38.402 "is_configured": true, 00:20:38.402 "data_offset": 0, 00:20:38.402 "data_size": 65536 00:20:38.402 }, 00:20:38.402 { 00:20:38.402 "name": "BaseBdev2", 00:20:38.403 "uuid": "fcf54f34-afac-4b01-a6a2-f919efef98fb", 00:20:38.403 "is_configured": true, 00:20:38.403 "data_offset": 0, 00:20:38.403 "data_size": 65536 00:20:38.403 }, 00:20:38.403 { 00:20:38.403 "name": "BaseBdev3", 00:20:38.403 "uuid": "8bb2f882-a162-47fb-8624-7c9bb4062b74", 00:20:38.403 "is_configured": true, 00:20:38.403 "data_offset": 0, 00:20:38.403 "data_size": 65536 00:20:38.403 }, 00:20:38.403 { 00:20:38.403 "name": "BaseBdev4", 00:20:38.403 "uuid": "b4f5f028-7c6d-449f-a338-4c16f99bd934", 00:20:38.403 "is_configured": true, 00:20:38.403 "data_offset": 0, 00:20:38.403 "data_size": 65536 00:20:38.403 } 00:20:38.403 ] 00:20:38.403 } 00:20:38.403 } 00:20:38.403 }' 00:20:38.403 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:38.403 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:38.403 BaseBdev2 00:20:38.403 BaseBdev3 00:20:38.403 BaseBdev4' 00:20:38.403 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:38.403 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:38.403 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:38.663 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:38.663 "name": "BaseBdev1", 00:20:38.663 "aliases": [ 00:20:38.663 "a2e66ddd-f076-4d34-9792-d2dec51ed018" 00:20:38.663 ], 00:20:38.663 "product_name": "Malloc disk", 00:20:38.663 "block_size": 512, 00:20:38.663 "num_blocks": 65536, 00:20:38.663 "uuid": "a2e66ddd-f076-4d34-9792-d2dec51ed018", 00:20:38.663 "assigned_rate_limits": { 00:20:38.663 "rw_ios_per_sec": 0, 00:20:38.663 "rw_mbytes_per_sec": 0, 00:20:38.663 "r_mbytes_per_sec": 0, 00:20:38.663 "w_mbytes_per_sec": 0 00:20:38.663 }, 00:20:38.663 "claimed": true, 00:20:38.663 "claim_type": "exclusive_write", 00:20:38.663 "zoned": false, 00:20:38.663 "supported_io_types": { 00:20:38.663 "read": true, 00:20:38.663 "write": true, 00:20:38.663 "unmap": true, 00:20:38.663 "flush": true, 00:20:38.663 "reset": true, 00:20:38.663 "nvme_admin": false, 00:20:38.663 "nvme_io": false, 00:20:38.663 "nvme_io_md": false, 00:20:38.663 "write_zeroes": true, 00:20:38.663 "zcopy": true, 00:20:38.663 "get_zone_info": false, 00:20:38.663 "zone_management": false, 00:20:38.663 "zone_append": false, 00:20:38.663 "compare": false, 00:20:38.663 "compare_and_write": false, 00:20:38.663 "abort": true, 00:20:38.663 "seek_hole": false, 00:20:38.663 "seek_data": 
false, 00:20:38.663 "copy": true, 00:20:38.663 "nvme_iov_md": false 00:20:38.663 }, 00:20:38.663 "memory_domains": [ 00:20:38.663 { 00:20:38.663 "dma_device_id": "system", 00:20:38.663 "dma_device_type": 1 00:20:38.663 }, 00:20:38.663 { 00:20:38.663 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:38.663 "dma_device_type": 2 00:20:38.663 } 00:20:38.663 ], 00:20:38.663 "driver_specific": {} 00:20:38.663 }' 00:20:38.663 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:38.663 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:38.923 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:38.923 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:38.923 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:38.923 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:38.923 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:38.923 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:38.923 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:38.923 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:38.923 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:39.184 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:39.184 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:39.184 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:39.184 13:29:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:39.184 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:39.184 "name": "BaseBdev2", 00:20:39.184 "aliases": [ 00:20:39.184 "fcf54f34-afac-4b01-a6a2-f919efef98fb" 00:20:39.184 ], 00:20:39.184 "product_name": "Malloc disk", 00:20:39.184 "block_size": 512, 00:20:39.184 "num_blocks": 65536, 00:20:39.184 "uuid": "fcf54f34-afac-4b01-a6a2-f919efef98fb", 00:20:39.184 "assigned_rate_limits": { 00:20:39.184 "rw_ios_per_sec": 0, 00:20:39.184 "rw_mbytes_per_sec": 0, 00:20:39.184 "r_mbytes_per_sec": 0, 00:20:39.184 "w_mbytes_per_sec": 0 00:20:39.184 }, 00:20:39.184 "claimed": true, 00:20:39.184 "claim_type": "exclusive_write", 00:20:39.184 "zoned": false, 00:20:39.184 "supported_io_types": { 00:20:39.184 "read": true, 00:20:39.184 "write": true, 00:20:39.184 "unmap": true, 00:20:39.184 "flush": true, 00:20:39.184 "reset": true, 00:20:39.184 "nvme_admin": false, 00:20:39.184 "nvme_io": false, 00:20:39.184 "nvme_io_md": false, 00:20:39.184 "write_zeroes": true, 00:20:39.184 "zcopy": true, 00:20:39.184 "get_zone_info": false, 00:20:39.184 "zone_management": false, 00:20:39.184 "zone_append": false, 00:20:39.184 "compare": false, 00:20:39.184 "compare_and_write": false, 00:20:39.184 "abort": true, 00:20:39.184 "seek_hole": false, 00:20:39.184 "seek_data": false, 00:20:39.184 "copy": true, 00:20:39.184 "nvme_iov_md": false 00:20:39.184 }, 00:20:39.184 "memory_domains": [ 00:20:39.184 { 00:20:39.184 "dma_device_id": "system", 00:20:39.184 "dma_device_type": 1 00:20:39.184 }, 00:20:39.184 { 00:20:39.184 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.184 "dma_device_type": 2 00:20:39.184 } 00:20:39.184 ], 00:20:39.184 "driver_specific": {} 00:20:39.184 }' 00:20:39.184 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:39.184 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:20:39.444 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:39.444 13:29:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:39.444 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:39.444 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:39.444 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:39.444 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:39.444 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:39.444 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:39.444 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:39.703 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:39.703 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:39.703 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:39.703 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:39.703 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:39.703 "name": "BaseBdev3", 00:20:39.703 "aliases": [ 00:20:39.703 "8bb2f882-a162-47fb-8624-7c9bb4062b74" 00:20:39.703 ], 00:20:39.703 "product_name": "Malloc disk", 00:20:39.703 "block_size": 512, 00:20:39.703 "num_blocks": 65536, 00:20:39.703 "uuid": "8bb2f882-a162-47fb-8624-7c9bb4062b74", 00:20:39.703 "assigned_rate_limits": { 00:20:39.703 "rw_ios_per_sec": 0, 00:20:39.703 "rw_mbytes_per_sec": 0, 00:20:39.703 "r_mbytes_per_sec": 0, 
00:20:39.703 "w_mbytes_per_sec": 0 00:20:39.703 }, 00:20:39.703 "claimed": true, 00:20:39.703 "claim_type": "exclusive_write", 00:20:39.703 "zoned": false, 00:20:39.703 "supported_io_types": { 00:20:39.703 "read": true, 00:20:39.703 "write": true, 00:20:39.703 "unmap": true, 00:20:39.703 "flush": true, 00:20:39.703 "reset": true, 00:20:39.703 "nvme_admin": false, 00:20:39.703 "nvme_io": false, 00:20:39.703 "nvme_io_md": false, 00:20:39.703 "write_zeroes": true, 00:20:39.703 "zcopy": true, 00:20:39.703 "get_zone_info": false, 00:20:39.703 "zone_management": false, 00:20:39.703 "zone_append": false, 00:20:39.703 "compare": false, 00:20:39.703 "compare_and_write": false, 00:20:39.703 "abort": true, 00:20:39.703 "seek_hole": false, 00:20:39.703 "seek_data": false, 00:20:39.703 "copy": true, 00:20:39.703 "nvme_iov_md": false 00:20:39.703 }, 00:20:39.703 "memory_domains": [ 00:20:39.703 { 00:20:39.703 "dma_device_id": "system", 00:20:39.703 "dma_device_type": 1 00:20:39.703 }, 00:20:39.703 { 00:20:39.703 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:39.703 "dma_device_type": 2 00:20:39.703 } 00:20:39.703 ], 00:20:39.703 "driver_specific": {} 00:20:39.703 }' 00:20:39.703 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:39.962 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:39.962 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:39.962 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:39.962 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:39.962 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:39.962 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:39.962 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:20:39.962 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:39.962 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:40.220 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:40.220 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:40.220 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:40.220 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:40.220 13:29:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:40.480 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:40.480 "name": "BaseBdev4", 00:20:40.480 "aliases": [ 00:20:40.480 "b4f5f028-7c6d-449f-a338-4c16f99bd934" 00:20:40.480 ], 00:20:40.480 "product_name": "Malloc disk", 00:20:40.480 "block_size": 512, 00:20:40.480 "num_blocks": 65536, 00:20:40.480 "uuid": "b4f5f028-7c6d-449f-a338-4c16f99bd934", 00:20:40.480 "assigned_rate_limits": { 00:20:40.480 "rw_ios_per_sec": 0, 00:20:40.480 "rw_mbytes_per_sec": 0, 00:20:40.480 "r_mbytes_per_sec": 0, 00:20:40.480 "w_mbytes_per_sec": 0 00:20:40.480 }, 00:20:40.480 "claimed": true, 00:20:40.480 "claim_type": "exclusive_write", 00:20:40.480 "zoned": false, 00:20:40.480 "supported_io_types": { 00:20:40.480 "read": true, 00:20:40.480 "write": true, 00:20:40.480 "unmap": true, 00:20:40.480 "flush": true, 00:20:40.480 "reset": true, 00:20:40.480 "nvme_admin": false, 00:20:40.480 "nvme_io": false, 00:20:40.480 "nvme_io_md": false, 00:20:40.480 "write_zeroes": true, 00:20:40.480 "zcopy": true, 00:20:40.480 "get_zone_info": false, 00:20:40.480 "zone_management": false, 00:20:40.480 "zone_append": false, 00:20:40.480 
"compare": false, 00:20:40.480 "compare_and_write": false, 00:20:40.480 "abort": true, 00:20:40.480 "seek_hole": false, 00:20:40.480 "seek_data": false, 00:20:40.480 "copy": true, 00:20:40.480 "nvme_iov_md": false 00:20:40.480 }, 00:20:40.480 "memory_domains": [ 00:20:40.480 { 00:20:40.480 "dma_device_id": "system", 00:20:40.480 "dma_device_type": 1 00:20:40.480 }, 00:20:40.480 { 00:20:40.480 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:40.480 "dma_device_type": 2 00:20:40.480 } 00:20:40.480 ], 00:20:40.480 "driver_specific": {} 00:20:40.480 }' 00:20:40.480 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:40.480 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:40.480 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:40.480 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:40.480 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:40.480 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:40.480 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:40.480 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:40.739 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:40.739 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:40.739 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:40.739 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:40.739 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:20:40.999 [2024-07-25 13:29:21.563566] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.999 13:29:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:40.999 "name": "Existed_Raid", 00:20:40.999 "uuid": "58d933f6-74d5-4f0a-b51e-9d8f1844675f", 00:20:40.999 "strip_size_kb": 0, 00:20:40.999 "state": "online", 00:20:40.999 "raid_level": "raid1", 00:20:40.999 "superblock": false, 00:20:40.999 "num_base_bdevs": 4, 00:20:40.999 "num_base_bdevs_discovered": 3, 00:20:40.999 "num_base_bdevs_operational": 3, 00:20:40.999 "base_bdevs_list": [ 00:20:40.999 { 00:20:40.999 "name": null, 00:20:40.999 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:40.999 "is_configured": false, 00:20:40.999 "data_offset": 0, 00:20:40.999 "data_size": 65536 00:20:40.999 }, 00:20:40.999 { 00:20:40.999 "name": "BaseBdev2", 00:20:40.999 "uuid": "fcf54f34-afac-4b01-a6a2-f919efef98fb", 00:20:40.999 "is_configured": true, 00:20:40.999 "data_offset": 0, 00:20:40.999 "data_size": 65536 00:20:40.999 }, 00:20:40.999 { 00:20:40.999 "name": "BaseBdev3", 00:20:40.999 "uuid": "8bb2f882-a162-47fb-8624-7c9bb4062b74", 00:20:40.999 "is_configured": true, 00:20:40.999 "data_offset": 0, 00:20:40.999 "data_size": 65536 00:20:40.999 }, 00:20:40.999 { 00:20:40.999 "name": "BaseBdev4", 00:20:40.999 "uuid": "b4f5f028-7c6d-449f-a338-4c16f99bd934", 00:20:40.999 "is_configured": true, 00:20:40.999 "data_offset": 0, 00:20:40.999 "data_size": 65536 00:20:40.999 } 00:20:40.999 ] 00:20:40.999 }' 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:40.999 13:29:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:41.568 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:41.568 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:41.568 13:29:22 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.568 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:41.828 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:41.828 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:41.828 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:42.089 [2024-07-25 13:29:22.650318] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:42.089 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:42.089 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:42.089 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.089 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:42.089 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:42.089 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:42.089 13:29:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:42.349 [2024-07-25 13:29:23.025075] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:42.349 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 
00:20:42.349 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:42.349 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.349 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:42.608 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:42.608 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:42.608 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:42.867 [2024-07-25 13:29:23.411817] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:42.867 [2024-07-25 13:29:23.411876] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:42.867 [2024-07-25 13:29:23.417949] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:42.867 [2024-07-25 13:29:23.417974] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:42.867 [2024-07-25 13:29:23.417980] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2790fd0 name Existed_Raid, state offline 00:20:42.867 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:42.867 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:42.867 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:42.867 13:29:23 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:42.868 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:42.868 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:42.868 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:42.868 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:42.868 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:42.868 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:43.127 BaseBdev2 00:20:43.127 13:29:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:43.127 13:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:20:43.127 13:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:43.127 13:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:43.127 13:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:43.127 13:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:43.127 13:29:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:43.386 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:43.646 [ 00:20:43.646 { 00:20:43.646 "name": "BaseBdev2", 00:20:43.646 "aliases": [ 
00:20:43.646 "428abfd3-bb43-42a7-83a3-4b791a5d041a" 00:20:43.646 ], 00:20:43.646 "product_name": "Malloc disk", 00:20:43.646 "block_size": 512, 00:20:43.646 "num_blocks": 65536, 00:20:43.646 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:43.646 "assigned_rate_limits": { 00:20:43.646 "rw_ios_per_sec": 0, 00:20:43.646 "rw_mbytes_per_sec": 0, 00:20:43.646 "r_mbytes_per_sec": 0, 00:20:43.646 "w_mbytes_per_sec": 0 00:20:43.646 }, 00:20:43.646 "claimed": false, 00:20:43.646 "zoned": false, 00:20:43.646 "supported_io_types": { 00:20:43.646 "read": true, 00:20:43.646 "write": true, 00:20:43.646 "unmap": true, 00:20:43.646 "flush": true, 00:20:43.646 "reset": true, 00:20:43.646 "nvme_admin": false, 00:20:43.646 "nvme_io": false, 00:20:43.646 "nvme_io_md": false, 00:20:43.646 "write_zeroes": true, 00:20:43.646 "zcopy": true, 00:20:43.646 "get_zone_info": false, 00:20:43.646 "zone_management": false, 00:20:43.646 "zone_append": false, 00:20:43.646 "compare": false, 00:20:43.646 "compare_and_write": false, 00:20:43.646 "abort": true, 00:20:43.646 "seek_hole": false, 00:20:43.646 "seek_data": false, 00:20:43.646 "copy": true, 00:20:43.646 "nvme_iov_md": false 00:20:43.646 }, 00:20:43.646 "memory_domains": [ 00:20:43.646 { 00:20:43.646 "dma_device_id": "system", 00:20:43.646 "dma_device_type": 1 00:20:43.646 }, 00:20:43.646 { 00:20:43.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.646 "dma_device_type": 2 00:20:43.646 } 00:20:43.646 ], 00:20:43.646 "driver_specific": {} 00:20:43.646 } 00:20:43.646 ] 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:43.646 BaseBdev3 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:43.646 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:43.905 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:44.163 [ 00:20:44.163 { 00:20:44.163 "name": "BaseBdev3", 00:20:44.163 "aliases": [ 00:20:44.163 "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c" 00:20:44.163 ], 00:20:44.163 "product_name": "Malloc disk", 00:20:44.163 "block_size": 512, 00:20:44.163 "num_blocks": 65536, 00:20:44.163 "uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:44.163 "assigned_rate_limits": { 00:20:44.163 "rw_ios_per_sec": 0, 00:20:44.163 "rw_mbytes_per_sec": 0, 00:20:44.163 "r_mbytes_per_sec": 0, 00:20:44.163 "w_mbytes_per_sec": 0 00:20:44.163 }, 00:20:44.163 "claimed": false, 00:20:44.163 "zoned": false, 00:20:44.163 "supported_io_types": { 00:20:44.163 "read": true, 00:20:44.163 "write": true, 00:20:44.163 "unmap": true, 00:20:44.163 "flush": true, 00:20:44.163 "reset": true, 00:20:44.163 "nvme_admin": false, 00:20:44.163 
"nvme_io": false, 00:20:44.163 "nvme_io_md": false, 00:20:44.163 "write_zeroes": true, 00:20:44.163 "zcopy": true, 00:20:44.163 "get_zone_info": false, 00:20:44.163 "zone_management": false, 00:20:44.163 "zone_append": false, 00:20:44.163 "compare": false, 00:20:44.163 "compare_and_write": false, 00:20:44.163 "abort": true, 00:20:44.163 "seek_hole": false, 00:20:44.163 "seek_data": false, 00:20:44.163 "copy": true, 00:20:44.163 "nvme_iov_md": false 00:20:44.163 }, 00:20:44.163 "memory_domains": [ 00:20:44.163 { 00:20:44.163 "dma_device_id": "system", 00:20:44.163 "dma_device_type": 1 00:20:44.163 }, 00:20:44.163 { 00:20:44.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:44.163 "dma_device_type": 2 00:20:44.163 } 00:20:44.163 ], 00:20:44.163 "driver_specific": {} 00:20:44.163 } 00:20:44.163 ] 00:20:44.163 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:44.163 13:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:44.163 13:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:44.163 13:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:44.163 BaseBdev4 00:20:44.422 13:29:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:44.422 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:20:44.422 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:44.422 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:44.422 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:44.422 13:29:24 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:44.422 13:29:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:44.422 13:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:44.681 [ 00:20:44.681 { 00:20:44.681 "name": "BaseBdev4", 00:20:44.681 "aliases": [ 00:20:44.681 "71384f88-21f4-4e4e-97f5-f8f53221ed93" 00:20:44.681 ], 00:20:44.681 "product_name": "Malloc disk", 00:20:44.681 "block_size": 512, 00:20:44.681 "num_blocks": 65536, 00:20:44.681 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:44.681 "assigned_rate_limits": { 00:20:44.681 "rw_ios_per_sec": 0, 00:20:44.681 "rw_mbytes_per_sec": 0, 00:20:44.681 "r_mbytes_per_sec": 0, 00:20:44.681 "w_mbytes_per_sec": 0 00:20:44.681 }, 00:20:44.681 "claimed": false, 00:20:44.681 "zoned": false, 00:20:44.681 "supported_io_types": { 00:20:44.681 "read": true, 00:20:44.681 "write": true, 00:20:44.681 "unmap": true, 00:20:44.681 "flush": true, 00:20:44.681 "reset": true, 00:20:44.681 "nvme_admin": false, 00:20:44.681 "nvme_io": false, 00:20:44.681 "nvme_io_md": false, 00:20:44.681 "write_zeroes": true, 00:20:44.681 "zcopy": true, 00:20:44.681 "get_zone_info": false, 00:20:44.681 "zone_management": false, 00:20:44.681 "zone_append": false, 00:20:44.681 "compare": false, 00:20:44.681 "compare_and_write": false, 00:20:44.681 "abort": true, 00:20:44.681 "seek_hole": false, 00:20:44.681 "seek_data": false, 00:20:44.681 "copy": true, 00:20:44.681 "nvme_iov_md": false 00:20:44.681 }, 00:20:44.681 "memory_domains": [ 00:20:44.681 { 00:20:44.681 "dma_device_id": "system", 00:20:44.681 "dma_device_type": 1 00:20:44.681 }, 00:20:44.681 { 00:20:44.681 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:44.681 "dma_device_type": 
2 00:20:44.681 } 00:20:44.681 ], 00:20:44.681 "driver_specific": {} 00:20:44.681 } 00:20:44.681 ] 00:20:44.681 13:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:44.681 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:44.681 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:44.681 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:44.939 [2024-07-25 13:29:25.474975] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:44.939 [2024-07-25 13:29:25.475005] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:44.939 [2024-07-25 13:29:25.475019] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:44.939 [2024-07-25 13:29:25.476055] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:44.939 [2024-07-25 13:29:25.476087] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=4 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:44.939 "name": "Existed_Raid", 00:20:44.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.939 "strip_size_kb": 0, 00:20:44.939 "state": "configuring", 00:20:44.939 "raid_level": "raid1", 00:20:44.939 "superblock": false, 00:20:44.939 "num_base_bdevs": 4, 00:20:44.939 "num_base_bdevs_discovered": 3, 00:20:44.939 "num_base_bdevs_operational": 4, 00:20:44.939 "base_bdevs_list": [ 00:20:44.939 { 00:20:44.939 "name": "BaseBdev1", 00:20:44.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:44.939 "is_configured": false, 00:20:44.939 "data_offset": 0, 00:20:44.939 "data_size": 0 00:20:44.939 }, 00:20:44.939 { 00:20:44.939 "name": "BaseBdev2", 00:20:44.939 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:44.939 "is_configured": true, 00:20:44.939 "data_offset": 0, 00:20:44.939 "data_size": 65536 00:20:44.939 }, 00:20:44.939 { 00:20:44.939 "name": "BaseBdev3", 00:20:44.939 "uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:44.939 "is_configured": true, 00:20:44.939 "data_offset": 0, 00:20:44.939 "data_size": 65536 00:20:44.939 }, 00:20:44.939 { 
00:20:44.939 "name": "BaseBdev4", 00:20:44.939 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:44.939 "is_configured": true, 00:20:44.939 "data_offset": 0, 00:20:44.939 "data_size": 65536 00:20:44.939 } 00:20:44.939 ] 00:20:44.939 }' 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:44.939 13:29:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:45.507 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:45.766 [2024-07-25 13:29:26.397292] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:45.766 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:46.026 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:46.026 "name": "Existed_Raid", 00:20:46.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.026 "strip_size_kb": 0, 00:20:46.026 "state": "configuring", 00:20:46.026 "raid_level": "raid1", 00:20:46.026 "superblock": false, 00:20:46.026 "num_base_bdevs": 4, 00:20:46.026 "num_base_bdevs_discovered": 2, 00:20:46.026 "num_base_bdevs_operational": 4, 00:20:46.026 "base_bdevs_list": [ 00:20:46.026 { 00:20:46.026 "name": "BaseBdev1", 00:20:46.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:46.026 "is_configured": false, 00:20:46.026 "data_offset": 0, 00:20:46.026 "data_size": 0 00:20:46.026 }, 00:20:46.026 { 00:20:46.026 "name": null, 00:20:46.026 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:46.026 "is_configured": false, 00:20:46.026 "data_offset": 0, 00:20:46.026 "data_size": 65536 00:20:46.026 }, 00:20:46.026 { 00:20:46.026 "name": "BaseBdev3", 00:20:46.026 "uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:46.026 "is_configured": true, 00:20:46.026 "data_offset": 0, 00:20:46.026 "data_size": 65536 00:20:46.026 }, 00:20:46.026 { 00:20:46.026 "name": "BaseBdev4", 00:20:46.026 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:46.026 "is_configured": true, 00:20:46.026 "data_offset": 0, 00:20:46.026 "data_size": 65536 00:20:46.026 } 00:20:46.026 ] 00:20:46.026 }' 00:20:46.026 13:29:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:46.026 13:29:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:46.595 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:46.595 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:46.595 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:46.595 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:46.855 [2024-07-25 13:29:27.545194] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:46.855 BaseBdev1 00:20:46.855 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:46.855 13:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:46.855 13:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:46.855 13:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:46.855 13:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:46.855 13:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:46.855 13:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:47.114 13:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:47.374 [ 00:20:47.374 { 00:20:47.374 "name": "BaseBdev1", 00:20:47.374 "aliases": [ 00:20:47.374 "d9143041-dfaf-49fe-a5fc-e1e09af5c403" 00:20:47.374 ], 00:20:47.374 
"product_name": "Malloc disk", 00:20:47.374 "block_size": 512, 00:20:47.374 "num_blocks": 65536, 00:20:47.374 "uuid": "d9143041-dfaf-49fe-a5fc-e1e09af5c403", 00:20:47.374 "assigned_rate_limits": { 00:20:47.374 "rw_ios_per_sec": 0, 00:20:47.374 "rw_mbytes_per_sec": 0, 00:20:47.374 "r_mbytes_per_sec": 0, 00:20:47.374 "w_mbytes_per_sec": 0 00:20:47.374 }, 00:20:47.374 "claimed": true, 00:20:47.374 "claim_type": "exclusive_write", 00:20:47.374 "zoned": false, 00:20:47.374 "supported_io_types": { 00:20:47.374 "read": true, 00:20:47.374 "write": true, 00:20:47.374 "unmap": true, 00:20:47.374 "flush": true, 00:20:47.374 "reset": true, 00:20:47.374 "nvme_admin": false, 00:20:47.374 "nvme_io": false, 00:20:47.374 "nvme_io_md": false, 00:20:47.374 "write_zeroes": true, 00:20:47.374 "zcopy": true, 00:20:47.374 "get_zone_info": false, 00:20:47.374 "zone_management": false, 00:20:47.374 "zone_append": false, 00:20:47.374 "compare": false, 00:20:47.374 "compare_and_write": false, 00:20:47.374 "abort": true, 00:20:47.374 "seek_hole": false, 00:20:47.374 "seek_data": false, 00:20:47.374 "copy": true, 00:20:47.374 "nvme_iov_md": false 00:20:47.374 }, 00:20:47.374 "memory_domains": [ 00:20:47.374 { 00:20:47.374 "dma_device_id": "system", 00:20:47.374 "dma_device_type": 1 00:20:47.374 }, 00:20:47.374 { 00:20:47.374 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:47.374 "dma_device_type": 2 00:20:47.374 } 00:20:47.374 ], 00:20:47.374 "driver_specific": {} 00:20:47.374 } 00:20:47.374 ] 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:47.374 
13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.374 13:29:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:47.374 13:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:47.374 "name": "Existed_Raid", 00:20:47.375 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:47.375 "strip_size_kb": 0, 00:20:47.375 "state": "configuring", 00:20:47.375 "raid_level": "raid1", 00:20:47.375 "superblock": false, 00:20:47.375 "num_base_bdevs": 4, 00:20:47.375 "num_base_bdevs_discovered": 3, 00:20:47.375 "num_base_bdevs_operational": 4, 00:20:47.375 "base_bdevs_list": [ 00:20:47.375 { 00:20:47.375 "name": "BaseBdev1", 00:20:47.375 "uuid": "d9143041-dfaf-49fe-a5fc-e1e09af5c403", 00:20:47.375 "is_configured": true, 00:20:47.375 "data_offset": 0, 00:20:47.375 "data_size": 65536 00:20:47.375 }, 00:20:47.375 { 00:20:47.375 "name": null, 00:20:47.375 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:47.375 "is_configured": false, 00:20:47.375 "data_offset": 0, 
00:20:47.375 "data_size": 65536 00:20:47.375 }, 00:20:47.375 { 00:20:47.375 "name": "BaseBdev3", 00:20:47.375 "uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:47.375 "is_configured": true, 00:20:47.375 "data_offset": 0, 00:20:47.375 "data_size": 65536 00:20:47.375 }, 00:20:47.375 { 00:20:47.375 "name": "BaseBdev4", 00:20:47.375 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:47.375 "is_configured": true, 00:20:47.375 "data_offset": 0, 00:20:47.375 "data_size": 65536 00:20:47.375 } 00:20:47.375 ] 00:20:47.375 }' 00:20:47.375 13:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:47.375 13:29:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:47.943 13:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:47.943 13:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:48.202 13:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:48.202 13:29:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:48.462 [2024-07-25 13:29:29.020951] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.462 "name": "Existed_Raid", 00:20:48.462 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:48.462 "strip_size_kb": 0, 00:20:48.462 "state": "configuring", 00:20:48.462 "raid_level": "raid1", 00:20:48.462 "superblock": false, 00:20:48.462 "num_base_bdevs": 4, 00:20:48.462 "num_base_bdevs_discovered": 2, 00:20:48.462 "num_base_bdevs_operational": 4, 00:20:48.462 "base_bdevs_list": [ 00:20:48.462 { 00:20:48.462 "name": "BaseBdev1", 00:20:48.462 "uuid": "d9143041-dfaf-49fe-a5fc-e1e09af5c403", 00:20:48.462 "is_configured": true, 00:20:48.462 "data_offset": 0, 00:20:48.462 "data_size": 65536 00:20:48.462 }, 00:20:48.462 { 00:20:48.462 "name": null, 00:20:48.462 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:48.462 "is_configured": false, 00:20:48.462 "data_offset": 0, 00:20:48.462 "data_size": 65536 00:20:48.462 }, 00:20:48.462 { 00:20:48.462 "name": null, 00:20:48.462 
"uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:48.462 "is_configured": false, 00:20:48.462 "data_offset": 0, 00:20:48.462 "data_size": 65536 00:20:48.462 }, 00:20:48.462 { 00:20:48.462 "name": "BaseBdev4", 00:20:48.462 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:48.462 "is_configured": true, 00:20:48.462 "data_offset": 0, 00:20:48.462 "data_size": 65536 00:20:48.462 } 00:20:48.462 ] 00:20:48.462 }' 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.462 13:29:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:49.029 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.029 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:49.289 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:49.289 13:29:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:49.548 [2024-07-25 13:29:30.139813] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:49.548 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:49.808 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:49.808 "name": "Existed_Raid", 00:20:49.808 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:49.808 "strip_size_kb": 0, 00:20:49.808 "state": "configuring", 00:20:49.808 "raid_level": "raid1", 00:20:49.808 "superblock": false, 00:20:49.808 "num_base_bdevs": 4, 00:20:49.808 "num_base_bdevs_discovered": 3, 00:20:49.808 "num_base_bdevs_operational": 4, 00:20:49.808 "base_bdevs_list": [ 00:20:49.808 { 00:20:49.808 "name": "BaseBdev1", 00:20:49.808 "uuid": "d9143041-dfaf-49fe-a5fc-e1e09af5c403", 00:20:49.808 "is_configured": true, 00:20:49.808 "data_offset": 0, 00:20:49.808 "data_size": 65536 00:20:49.808 }, 00:20:49.808 { 00:20:49.808 "name": null, 00:20:49.808 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:49.808 "is_configured": false, 00:20:49.808 "data_offset": 0, 00:20:49.808 "data_size": 65536 00:20:49.808 }, 00:20:49.808 { 00:20:49.808 "name": "BaseBdev3", 00:20:49.808 "uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:49.808 "is_configured": true, 
00:20:49.808 "data_offset": 0, 00:20:49.808 "data_size": 65536 00:20:49.808 }, 00:20:49.808 { 00:20:49.808 "name": "BaseBdev4", 00:20:49.808 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:49.808 "is_configured": true, 00:20:49.808 "data_offset": 0, 00:20:49.808 "data_size": 65536 00:20:49.808 } 00:20:49.808 ] 00:20:49.808 }' 00:20:49.808 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:49.808 13:29:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:50.378 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.378 13:29:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:50.378 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:50.378 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:50.637 [2024-07-25 13:29:31.274680] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=4 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:50.637 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:50.896 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:50.896 "name": "Existed_Raid", 00:20:50.896 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:50.896 "strip_size_kb": 0, 00:20:50.896 "state": "configuring", 00:20:50.896 "raid_level": "raid1", 00:20:50.896 "superblock": false, 00:20:50.896 "num_base_bdevs": 4, 00:20:50.896 "num_base_bdevs_discovered": 2, 00:20:50.896 "num_base_bdevs_operational": 4, 00:20:50.896 "base_bdevs_list": [ 00:20:50.896 { 00:20:50.896 "name": null, 00:20:50.896 "uuid": "d9143041-dfaf-49fe-a5fc-e1e09af5c403", 00:20:50.896 "is_configured": false, 00:20:50.896 "data_offset": 0, 00:20:50.896 "data_size": 65536 00:20:50.896 }, 00:20:50.897 { 00:20:50.897 "name": null, 00:20:50.897 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:50.897 "is_configured": false, 00:20:50.897 "data_offset": 0, 00:20:50.897 "data_size": 65536 00:20:50.897 }, 00:20:50.897 { 00:20:50.897 "name": "BaseBdev3", 00:20:50.897 "uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:50.897 "is_configured": true, 00:20:50.897 "data_offset": 0, 00:20:50.897 "data_size": 65536 00:20:50.897 }, 00:20:50.897 { 00:20:50.897 "name": 
"BaseBdev4", 00:20:50.897 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:50.897 "is_configured": true, 00:20:50.897 "data_offset": 0, 00:20:50.897 "data_size": 65536 00:20:50.897 } 00:20:50.897 ] 00:20:50.897 }' 00:20:50.897 13:29:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:50.897 13:29:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:51.466 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.466 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:51.466 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:51.466 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:51.726 [2024-07-25 13:29:32.343223] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:51.726 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:51.987 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:51.987 "name": "Existed_Raid", 00:20:51.987 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:51.987 "strip_size_kb": 0, 00:20:51.987 "state": "configuring", 00:20:51.987 "raid_level": "raid1", 00:20:51.987 "superblock": false, 00:20:51.987 "num_base_bdevs": 4, 00:20:51.987 "num_base_bdevs_discovered": 3, 00:20:51.987 "num_base_bdevs_operational": 4, 00:20:51.987 "base_bdevs_list": [ 00:20:51.987 { 00:20:51.987 "name": null, 00:20:51.987 "uuid": "d9143041-dfaf-49fe-a5fc-e1e09af5c403", 00:20:51.987 "is_configured": false, 00:20:51.987 "data_offset": 0, 00:20:51.987 "data_size": 65536 00:20:51.987 }, 00:20:51.987 { 00:20:51.987 "name": "BaseBdev2", 00:20:51.987 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:51.987 "is_configured": true, 00:20:51.987 "data_offset": 0, 00:20:51.987 "data_size": 65536 00:20:51.987 }, 00:20:51.987 { 00:20:51.987 "name": "BaseBdev3", 00:20:51.987 "uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:51.987 "is_configured": true, 00:20:51.987 "data_offset": 0, 00:20:51.987 "data_size": 65536 00:20:51.987 }, 00:20:51.987 { 00:20:51.987 "name": "BaseBdev4", 00:20:51.987 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:51.987 
"is_configured": true, 00:20:51.987 "data_offset": 0, 00:20:51.987 "data_size": 65536 00:20:51.987 } 00:20:51.987 ] 00:20:51.987 }' 00:20:51.987 13:29:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:51.987 13:29:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:52.556 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.556 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:52.556 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:52.556 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.556 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:52.816 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d9143041-dfaf-49fe-a5fc-e1e09af5c403 00:20:53.077 [2024-07-25 13:29:33.611475] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:53.077 [2024-07-25 13:29:33.611504] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2790c70 00:20:53.077 [2024-07-25 13:29:33.611513] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:53.077 [2024-07-25 13:29:33.611677] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x278f4f0 00:20:53.077 [2024-07-25 13:29:33.611777] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2790c70 00:20:53.077 [2024-07-25 
13:29:33.611783] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2790c70 00:20:53.077 [2024-07-25 13:29:33.611898] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:53.077 NewBaseBdev 00:20:53.077 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:53.077 13:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:20:53.077 13:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:53.077 13:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # local i 00:20:53.077 13:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:53.077 13:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:53.077 13:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:53.077 13:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:53.337 [ 00:20:53.337 { 00:20:53.337 "name": "NewBaseBdev", 00:20:53.337 "aliases": [ 00:20:53.337 "d9143041-dfaf-49fe-a5fc-e1e09af5c403" 00:20:53.337 ], 00:20:53.337 "product_name": "Malloc disk", 00:20:53.337 "block_size": 512, 00:20:53.337 "num_blocks": 65536, 00:20:53.337 "uuid": "d9143041-dfaf-49fe-a5fc-e1e09af5c403", 00:20:53.337 "assigned_rate_limits": { 00:20:53.337 "rw_ios_per_sec": 0, 00:20:53.337 "rw_mbytes_per_sec": 0, 00:20:53.337 "r_mbytes_per_sec": 0, 00:20:53.337 "w_mbytes_per_sec": 0 00:20:53.337 }, 00:20:53.337 "claimed": true, 00:20:53.337 "claim_type": "exclusive_write", 00:20:53.337 "zoned": 
false, 00:20:53.337 "supported_io_types": { 00:20:53.337 "read": true, 00:20:53.337 "write": true, 00:20:53.337 "unmap": true, 00:20:53.337 "flush": true, 00:20:53.337 "reset": true, 00:20:53.337 "nvme_admin": false, 00:20:53.337 "nvme_io": false, 00:20:53.337 "nvme_io_md": false, 00:20:53.337 "write_zeroes": true, 00:20:53.337 "zcopy": true, 00:20:53.337 "get_zone_info": false, 00:20:53.337 "zone_management": false, 00:20:53.337 "zone_append": false, 00:20:53.337 "compare": false, 00:20:53.337 "compare_and_write": false, 00:20:53.337 "abort": true, 00:20:53.337 "seek_hole": false, 00:20:53.337 "seek_data": false, 00:20:53.337 "copy": true, 00:20:53.337 "nvme_iov_md": false 00:20:53.337 }, 00:20:53.337 "memory_domains": [ 00:20:53.337 { 00:20:53.337 "dma_device_id": "system", 00:20:53.337 "dma_device_type": 1 00:20:53.337 }, 00:20:53.337 { 00:20:53.337 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:53.337 "dma_device_type": 2 00:20:53.337 } 00:20:53.337 ], 00:20:53.337 "driver_specific": {} 00:20:53.337 } 00:20:53.337 ] 00:20:53.337 13:29:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@907 -- # return 0 00:20:53.337 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:53.337 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:53.337 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:53.337 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:53.338 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:53.338 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:53.338 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:53.338 13:29:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:53.338 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:53.338 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:53.338 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:53.338 13:29:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:53.597 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:53.597 "name": "Existed_Raid", 00:20:53.597 "uuid": "96cb3119-d55b-48f2-a403-23383aa7c84c", 00:20:53.597 "strip_size_kb": 0, 00:20:53.597 "state": "online", 00:20:53.597 "raid_level": "raid1", 00:20:53.597 "superblock": false, 00:20:53.597 "num_base_bdevs": 4, 00:20:53.597 "num_base_bdevs_discovered": 4, 00:20:53.597 "num_base_bdevs_operational": 4, 00:20:53.597 "base_bdevs_list": [ 00:20:53.597 { 00:20:53.597 "name": "NewBaseBdev", 00:20:53.597 "uuid": "d9143041-dfaf-49fe-a5fc-e1e09af5c403", 00:20:53.597 "is_configured": true, 00:20:53.597 "data_offset": 0, 00:20:53.597 "data_size": 65536 00:20:53.597 }, 00:20:53.597 { 00:20:53.597 "name": "BaseBdev2", 00:20:53.597 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:53.597 "is_configured": true, 00:20:53.597 "data_offset": 0, 00:20:53.597 "data_size": 65536 00:20:53.597 }, 00:20:53.597 { 00:20:53.597 "name": "BaseBdev3", 00:20:53.598 "uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:53.598 "is_configured": true, 00:20:53.598 "data_offset": 0, 00:20:53.598 "data_size": 65536 00:20:53.598 }, 00:20:53.598 { 00:20:53.598 "name": "BaseBdev4", 00:20:53.598 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:53.598 "is_configured": true, 00:20:53.598 "data_offset": 0, 00:20:53.598 
"data_size": 65536 00:20:53.598 } 00:20:53.598 ] 00:20:53.598 }' 00:20:53.598 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:53.598 13:29:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:54.168 [2024-07-25 13:29:34.895000] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:54.168 "name": "Existed_Raid", 00:20:54.168 "aliases": [ 00:20:54.168 "96cb3119-d55b-48f2-a403-23383aa7c84c" 00:20:54.168 ], 00:20:54.168 "product_name": "Raid Volume", 00:20:54.168 "block_size": 512, 00:20:54.168 "num_blocks": 65536, 00:20:54.168 "uuid": "96cb3119-d55b-48f2-a403-23383aa7c84c", 00:20:54.168 "assigned_rate_limits": { 00:20:54.168 "rw_ios_per_sec": 0, 00:20:54.168 "rw_mbytes_per_sec": 0, 00:20:54.168 "r_mbytes_per_sec": 0, 00:20:54.168 "w_mbytes_per_sec": 0 00:20:54.168 }, 00:20:54.168 "claimed": false, 
00:20:54.168 "zoned": false, 00:20:54.168 "supported_io_types": { 00:20:54.168 "read": true, 00:20:54.168 "write": true, 00:20:54.168 "unmap": false, 00:20:54.168 "flush": false, 00:20:54.168 "reset": true, 00:20:54.168 "nvme_admin": false, 00:20:54.168 "nvme_io": false, 00:20:54.168 "nvme_io_md": false, 00:20:54.168 "write_zeroes": true, 00:20:54.168 "zcopy": false, 00:20:54.168 "get_zone_info": false, 00:20:54.168 "zone_management": false, 00:20:54.168 "zone_append": false, 00:20:54.168 "compare": false, 00:20:54.168 "compare_and_write": false, 00:20:54.168 "abort": false, 00:20:54.168 "seek_hole": false, 00:20:54.168 "seek_data": false, 00:20:54.168 "copy": false, 00:20:54.168 "nvme_iov_md": false 00:20:54.168 }, 00:20:54.168 "memory_domains": [ 00:20:54.168 { 00:20:54.168 "dma_device_id": "system", 00:20:54.168 "dma_device_type": 1 00:20:54.168 }, 00:20:54.168 { 00:20:54.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.168 "dma_device_type": 2 00:20:54.168 }, 00:20:54.168 { 00:20:54.168 "dma_device_id": "system", 00:20:54.168 "dma_device_type": 1 00:20:54.168 }, 00:20:54.168 { 00:20:54.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.168 "dma_device_type": 2 00:20:54.168 }, 00:20:54.168 { 00:20:54.168 "dma_device_id": "system", 00:20:54.168 "dma_device_type": 1 00:20:54.168 }, 00:20:54.168 { 00:20:54.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.168 "dma_device_type": 2 00:20:54.168 }, 00:20:54.168 { 00:20:54.168 "dma_device_id": "system", 00:20:54.168 "dma_device_type": 1 00:20:54.168 }, 00:20:54.168 { 00:20:54.168 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.168 "dma_device_type": 2 00:20:54.168 } 00:20:54.168 ], 00:20:54.168 "driver_specific": { 00:20:54.168 "raid": { 00:20:54.168 "uuid": "96cb3119-d55b-48f2-a403-23383aa7c84c", 00:20:54.168 "strip_size_kb": 0, 00:20:54.168 "state": "online", 00:20:54.168 "raid_level": "raid1", 00:20:54.168 "superblock": false, 00:20:54.168 "num_base_bdevs": 4, 00:20:54.168 
"num_base_bdevs_discovered": 4, 00:20:54.168 "num_base_bdevs_operational": 4, 00:20:54.168 "base_bdevs_list": [ 00:20:54.168 { 00:20:54.168 "name": "NewBaseBdev", 00:20:54.168 "uuid": "d9143041-dfaf-49fe-a5fc-e1e09af5c403", 00:20:54.168 "is_configured": true, 00:20:54.168 "data_offset": 0, 00:20:54.168 "data_size": 65536 00:20:54.168 }, 00:20:54.168 { 00:20:54.168 "name": "BaseBdev2", 00:20:54.168 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:54.168 "is_configured": true, 00:20:54.168 "data_offset": 0, 00:20:54.168 "data_size": 65536 00:20:54.168 }, 00:20:54.168 { 00:20:54.168 "name": "BaseBdev3", 00:20:54.168 "uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:54.168 "is_configured": true, 00:20:54.168 "data_offset": 0, 00:20:54.168 "data_size": 65536 00:20:54.168 }, 00:20:54.168 { 00:20:54.168 "name": "BaseBdev4", 00:20:54.168 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:54.168 "is_configured": true, 00:20:54.168 "data_offset": 0, 00:20:54.168 "data_size": 65536 00:20:54.168 } 00:20:54.168 ] 00:20:54.168 } 00:20:54.168 } 00:20:54.168 }' 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:54.168 BaseBdev2 00:20:54.168 BaseBdev3 00:20:54.168 BaseBdev4' 00:20:54.168 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.428 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:54.428 13:29:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.428 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.428 "name": "NewBaseBdev", 
00:20:54.428 "aliases": [ 00:20:54.428 "d9143041-dfaf-49fe-a5fc-e1e09af5c403" 00:20:54.428 ], 00:20:54.428 "product_name": "Malloc disk", 00:20:54.428 "block_size": 512, 00:20:54.428 "num_blocks": 65536, 00:20:54.428 "uuid": "d9143041-dfaf-49fe-a5fc-e1e09af5c403", 00:20:54.428 "assigned_rate_limits": { 00:20:54.428 "rw_ios_per_sec": 0, 00:20:54.428 "rw_mbytes_per_sec": 0, 00:20:54.428 "r_mbytes_per_sec": 0, 00:20:54.428 "w_mbytes_per_sec": 0 00:20:54.428 }, 00:20:54.428 "claimed": true, 00:20:54.428 "claim_type": "exclusive_write", 00:20:54.428 "zoned": false, 00:20:54.428 "supported_io_types": { 00:20:54.428 "read": true, 00:20:54.428 "write": true, 00:20:54.428 "unmap": true, 00:20:54.428 "flush": true, 00:20:54.428 "reset": true, 00:20:54.428 "nvme_admin": false, 00:20:54.428 "nvme_io": false, 00:20:54.428 "nvme_io_md": false, 00:20:54.428 "write_zeroes": true, 00:20:54.428 "zcopy": true, 00:20:54.428 "get_zone_info": false, 00:20:54.428 "zone_management": false, 00:20:54.428 "zone_append": false, 00:20:54.428 "compare": false, 00:20:54.428 "compare_and_write": false, 00:20:54.428 "abort": true, 00:20:54.428 "seek_hole": false, 00:20:54.428 "seek_data": false, 00:20:54.428 "copy": true, 00:20:54.428 "nvme_iov_md": false 00:20:54.428 }, 00:20:54.428 "memory_domains": [ 00:20:54.428 { 00:20:54.428 "dma_device_id": "system", 00:20:54.428 "dma_device_type": 1 00:20:54.428 }, 00:20:54.428 { 00:20:54.428 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.428 "dma_device_type": 2 00:20:54.428 } 00:20:54.428 ], 00:20:54.428 "driver_specific": {} 00:20:54.428 }' 00:20:54.428 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.428 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.688 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:54.688 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 
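The `jq .block_size` / `[[ 512 == 512 ]]` / `jq .md_size` sequence traced above repeats one small pattern per base bdev: extract a single field from the bdev's JSON and compare it to the expected value, with absent fields comparing against the literal string `null`. A minimal sketch of that pattern, assuming `jq` is available and using an inline sample document in place of real `rpc.py bdev_get_bdevs -b NewBaseBdev` output:

```shell
# Hedged sketch: per-field property check as seen in the trace.
# The sample JSON is a stand-in, not real RPC output.
base_bdev_info='{ "name": "NewBaseBdev", "block_size": 512, "num_blocks": 65536 }'

block_size=$(echo "$base_bdev_info" | jq .block_size)
md_size=$(echo "$base_bdev_info" | jq .md_size)   # missing key -> jq prints "null"

# Compare extracted values against expectations, exactly as the [[ ]] checks do.
[[ "$block_size" == 512 ]] && [[ "$md_size" == null ]] && echo ok
```

Note the asymmetry: present fields compare numerically or as strings, while fields the Malloc bdev never reports (`md_size`, `md_interleave`, `dif_type`) all reduce to the same `[[ null == null ]]` check.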
00:20:54.688 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:54.688 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:54.688 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.688 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:54.688 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:54.688 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.688 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:54.948 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:54.948 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:54.948 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:54.948 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:54.948 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:54.948 "name": "BaseBdev2", 00:20:54.948 "aliases": [ 00:20:54.948 "428abfd3-bb43-42a7-83a3-4b791a5d041a" 00:20:54.948 ], 00:20:54.948 "product_name": "Malloc disk", 00:20:54.948 "block_size": 512, 00:20:54.948 "num_blocks": 65536, 00:20:54.948 "uuid": "428abfd3-bb43-42a7-83a3-4b791a5d041a", 00:20:54.948 "assigned_rate_limits": { 00:20:54.948 "rw_ios_per_sec": 0, 00:20:54.948 "rw_mbytes_per_sec": 0, 00:20:54.948 "r_mbytes_per_sec": 0, 00:20:54.948 "w_mbytes_per_sec": 0 00:20:54.948 }, 00:20:54.948 "claimed": true, 00:20:54.948 "claim_type": "exclusive_write", 00:20:54.948 "zoned": false, 00:20:54.948 "supported_io_types": { 00:20:54.948 
"read": true, 00:20:54.948 "write": true, 00:20:54.948 "unmap": true, 00:20:54.948 "flush": true, 00:20:54.948 "reset": true, 00:20:54.948 "nvme_admin": false, 00:20:54.948 "nvme_io": false, 00:20:54.948 "nvme_io_md": false, 00:20:54.948 "write_zeroes": true, 00:20:54.948 "zcopy": true, 00:20:54.948 "get_zone_info": false, 00:20:54.948 "zone_management": false, 00:20:54.948 "zone_append": false, 00:20:54.948 "compare": false, 00:20:54.948 "compare_and_write": false, 00:20:54.948 "abort": true, 00:20:54.948 "seek_hole": false, 00:20:54.948 "seek_data": false, 00:20:54.948 "copy": true, 00:20:54.948 "nvme_iov_md": false 00:20:54.948 }, 00:20:54.948 "memory_domains": [ 00:20:54.948 { 00:20:54.948 "dma_device_id": "system", 00:20:54.948 "dma_device_type": 1 00:20:54.948 }, 00:20:54.948 { 00:20:54.948 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:54.948 "dma_device_type": 2 00:20:54.948 } 00:20:54.948 ], 00:20:54.948 "driver_specific": {} 00:20:54.948 }' 00:20:54.948 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:54.948 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.208 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:55.208 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.208 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.208 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.208 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.208 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.208 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.208 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:20:55.208 13:29:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.468 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.468 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:55.468 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:55.468 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:55.468 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:55.468 "name": "BaseBdev3", 00:20:55.468 "aliases": [ 00:20:55.468 "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c" 00:20:55.468 ], 00:20:55.468 "product_name": "Malloc disk", 00:20:55.468 "block_size": 512, 00:20:55.468 "num_blocks": 65536, 00:20:55.468 "uuid": "7ea1a940-8cc2-4c8d-9d70-20b3daf5613c", 00:20:55.468 "assigned_rate_limits": { 00:20:55.468 "rw_ios_per_sec": 0, 00:20:55.468 "rw_mbytes_per_sec": 0, 00:20:55.468 "r_mbytes_per_sec": 0, 00:20:55.468 "w_mbytes_per_sec": 0 00:20:55.468 }, 00:20:55.468 "claimed": true, 00:20:55.468 "claim_type": "exclusive_write", 00:20:55.468 "zoned": false, 00:20:55.468 "supported_io_types": { 00:20:55.468 "read": true, 00:20:55.468 "write": true, 00:20:55.468 "unmap": true, 00:20:55.468 "flush": true, 00:20:55.468 "reset": true, 00:20:55.468 "nvme_admin": false, 00:20:55.468 "nvme_io": false, 00:20:55.468 "nvme_io_md": false, 00:20:55.468 "write_zeroes": true, 00:20:55.468 "zcopy": true, 00:20:55.468 "get_zone_info": false, 00:20:55.468 "zone_management": false, 00:20:55.468 "zone_append": false, 00:20:55.468 "compare": false, 00:20:55.468 "compare_and_write": false, 00:20:55.468 "abort": true, 00:20:55.468 "seek_hole": false, 00:20:55.468 "seek_data": false, 00:20:55.468 "copy": true, 00:20:55.468 "nvme_iov_md": 
false 00:20:55.468 }, 00:20:55.468 "memory_domains": [ 00:20:55.468 { 00:20:55.468 "dma_device_id": "system", 00:20:55.468 "dma_device_type": 1 00:20:55.468 }, 00:20:55.468 { 00:20:55.468 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:55.468 "dma_device_type": 2 00:20:55.468 } 00:20:55.468 ], 00:20:55.468 "driver_specific": {} 00:20:55.468 }' 00:20:55.468 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.468 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.727 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:55.727 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.727 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:55.727 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:55.727 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.727 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:55.727 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:55.727 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.727 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:55.987 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:55.987 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:55.987 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:55.987 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 
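The outer loop driving these per-bdev checks (`for name in $base_bdev_names`) gets its input from a single jq filter over the raid bdev's JSON: select every entry of `base_bdevs_list` whose `is_configured` is true and emit its name. A sketch of that extraction, assuming `jq` is available and substituting a hand-written document for real `bdev_raid_get_bdevs` output:

```shell
# Hedged sketch: derive the list of configured base bdev names, then loop.
# Sample JSON is illustrative only; a real raid volume has more fields.
raid_bdev_info='{
  "driver_specific": { "raid": { "base_bdevs_list": [
    { "name": "NewBaseBdev", "is_configured": true },
    { "name": "BaseBdev2",   "is_configured": true },
    { "name": "BaseBdev3",   "is_configured": false }
  ] } }
}'

# -r strips JSON quoting so the names word-split cleanly in the for loop.
base_bdev_names=$(echo "$raid_bdev_info" |
  jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name')

for name in $base_bdev_names; do
  echo "checking $name"        # the real loop runs bdev_get_bdevs -b "$name" here
done
```

Unconfigured members drop out of the list entirely, so the property loop never probes a base bdev that is not yet claimed by the raid volume.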
00:20:55.987 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:55.987 "name": "BaseBdev4", 00:20:55.987 "aliases": [ 00:20:55.987 "71384f88-21f4-4e4e-97f5-f8f53221ed93" 00:20:55.987 ], 00:20:55.987 "product_name": "Malloc disk", 00:20:55.987 "block_size": 512, 00:20:55.987 "num_blocks": 65536, 00:20:55.987 "uuid": "71384f88-21f4-4e4e-97f5-f8f53221ed93", 00:20:55.987 "assigned_rate_limits": { 00:20:55.987 "rw_ios_per_sec": 0, 00:20:55.987 "rw_mbytes_per_sec": 0, 00:20:55.987 "r_mbytes_per_sec": 0, 00:20:55.987 "w_mbytes_per_sec": 0 00:20:55.987 }, 00:20:55.987 "claimed": true, 00:20:55.987 "claim_type": "exclusive_write", 00:20:55.987 "zoned": false, 00:20:55.987 "supported_io_types": { 00:20:55.987 "read": true, 00:20:55.987 "write": true, 00:20:55.987 "unmap": true, 00:20:55.987 "flush": true, 00:20:55.987 "reset": true, 00:20:55.987 "nvme_admin": false, 00:20:55.987 "nvme_io": false, 00:20:55.987 "nvme_io_md": false, 00:20:55.987 "write_zeroes": true, 00:20:55.987 "zcopy": true, 00:20:55.987 "get_zone_info": false, 00:20:55.987 "zone_management": false, 00:20:55.987 "zone_append": false, 00:20:55.987 "compare": false, 00:20:55.987 "compare_and_write": false, 00:20:55.987 "abort": true, 00:20:55.987 "seek_hole": false, 00:20:55.987 "seek_data": false, 00:20:55.987 "copy": true, 00:20:55.987 "nvme_iov_md": false 00:20:55.987 }, 00:20:55.987 "memory_domains": [ 00:20:55.987 { 00:20:55.987 "dma_device_id": "system", 00:20:55.987 "dma_device_type": 1 00:20:55.987 }, 00:20:55.987 { 00:20:55.987 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:55.987 "dma_device_type": 2 00:20:55.987 } 00:20:55.987 ], 00:20:55.987 "driver_specific": {} 00:20:55.987 }' 00:20:55.987 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:55.987 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:56.247 13:29:36 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:56.247 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:56.247 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:56.247 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:56.247 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:56.247 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:56.247 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:56.247 13:29:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:56.247 13:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:56.508 [2024-07-25 13:29:37.228688] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:56.508 [2024-07-25 13:29:37.228708] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:56.508 [2024-07-25 13:29:37.228746] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:56.508 [2024-07-25 13:29:37.228957] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:56.508 [2024-07-25 13:29:37.228963] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2790c70 name Existed_Raid, state offline 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 976085 00:20:56.508 13:29:37 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@950 -- # '[' -z 976085 ']' 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # kill -0 976085 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # uname 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 976085 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 976085' 00:20:56.508 killing process with pid 976085 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@969 -- # kill 976085 00:20:56.508 [2024-07-25 13:29:37.295721] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:56.508 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@974 -- # wait 976085 00:20:56.769 [2024-07-25 13:29:37.316405] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:56.769 00:20:56.769 real 0m29.642s 00:20:56.769 user 0m55.626s 00:20:56.769 sys 0m4.262s 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:56.769 ************************************ 00:20:56.769 END TEST raid_state_function_test 00:20:56.769 ************************************ 00:20:56.769 13:29:37 bdev_raid -- bdev/bdev_raid.sh@948 -- # 
run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:20:56.769 13:29:37 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:20:56.769 13:29:37 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:20:56.769 13:29:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:56.769 ************************************ 00:20:56.769 START TEST raid_state_function_test_sb 00:20:56.769 ************************************ 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 4 true 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=981525 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 981525' 00:20:56.769 Process 
raid pid: 981525 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 981525 /var/tmp/spdk-raid.sock 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@831 -- # '[' -z 981525 ']' 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:56.769 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:20:56.769 13:29:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:57.029 [2024-07-25 13:29:37.580172] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
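Each test in this log brackets its work with the same process lifecycle: start the app in the background, record `$!`, poll until it is ready (`waitforlisten` polls the RPC socket), and later verify liveness with `kill -0` plus a `ps` command-name check before killing and reaping the pid. A minimal sketch of that discipline, with a plain `sleep` standing in for the `bdev_svc` app and a pid-liveness probe standing in for the RPC-socket poll (both are stand-ins, not the real helpers):

```shell
# Hedged sketch: background-app lifecycle as used around each test.
sleep 30 &                     # stand-in for: bdev_svc -r /var/tmp/spdk-raid.sock &
app_pid=$!

# Poll until the process is up (the real code polls the RPC socket instead).
for i in 1 2 3; do
  kill -0 "$app_pid" 2>/dev/null && break
  sleep 1
done

# Sanity-check we are about to kill the right process, as killprocess does.
comm=$(ps --no-headers -o comm= "$app_pid")
[[ "$comm" == sleep ]]

kill "$app_pid" 2>/dev/null
wait "$app_pid" 2>/dev/null || true   # reap so no zombie survives the test
echo cleaned up
```

Reaping with `wait` matters in these suites: the next test reuses the same RPC socket path, so a leftover process would make its `pgrep -af .../spdk` cleanup pass non-trivial.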
00:20:57.029 [2024-07-25 13:29:37.580223] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:57.029 [2024-07-25 13:29:37.670183] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:57.029 [2024-07-25 13:29:37.737180] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:20:57.029 [2024-07-25 13:29:37.783105] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:57.029 [2024-07-25 13:29:37.783128] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:57.969 13:29:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:20:57.969 13:29:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@864 -- # return 0 00:20:57.969 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:57.969 [2024-07-25 13:29:38.598087] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:57.969 [2024-07-25 13:29:38.598116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:57.969 [2024-07-25 13:29:38.598123] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:57.969 [2024-07-25 13:29:38.598128] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:57.969 [2024-07-25 13:29:38.598133] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:57.969 [2024-07-25 13:29:38.598138] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 
00:20:57.969 [2024-07-25 13:29:38.598142] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:57.969 [2024-07-25 13:29:38.598148] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:57.969 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:57.969 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:57.969 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:57.970 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:57.970 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:57.970 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:57.970 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.970 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.970 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.970 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.970 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.970 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:58.230 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:58.230 "name": "Existed_Raid", 00:20:58.230 "uuid": "cbe2e99a-82c4-4388-9984-500788fe4e65", 
00:20:58.230 "strip_size_kb": 0, 00:20:58.230 "state": "configuring", 00:20:58.230 "raid_level": "raid1", 00:20:58.230 "superblock": true, 00:20:58.230 "num_base_bdevs": 4, 00:20:58.230 "num_base_bdevs_discovered": 0, 00:20:58.230 "num_base_bdevs_operational": 4, 00:20:58.230 "base_bdevs_list": [ 00:20:58.230 { 00:20:58.230 "name": "BaseBdev1", 00:20:58.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.230 "is_configured": false, 00:20:58.230 "data_offset": 0, 00:20:58.230 "data_size": 0 00:20:58.230 }, 00:20:58.230 { 00:20:58.230 "name": "BaseBdev2", 00:20:58.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.230 "is_configured": false, 00:20:58.230 "data_offset": 0, 00:20:58.230 "data_size": 0 00:20:58.230 }, 00:20:58.230 { 00:20:58.230 "name": "BaseBdev3", 00:20:58.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.230 "is_configured": false, 00:20:58.230 "data_offset": 0, 00:20:58.230 "data_size": 0 00:20:58.230 }, 00:20:58.230 { 00:20:58.230 "name": "BaseBdev4", 00:20:58.230 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:58.230 "is_configured": false, 00:20:58.230 "data_offset": 0, 00:20:58.230 "data_size": 0 00:20:58.230 } 00:20:58.230 ] 00:20:58.230 }' 00:20:58.230 13:29:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:58.230 13:29:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:58.835 13:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:58.835 [2024-07-25 13:29:39.504265] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:58.835 [2024-07-25 13:29:39.504288] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd196f0 name Existed_Raid, state configuring 00:20:58.835 13:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:59.110 [2024-07-25 13:29:39.696772] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:59.110 [2024-07-25 13:29:39.696793] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:59.110 [2024-07-25 13:29:39.696798] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:59.110 [2024-07-25 13:29:39.696803] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:59.110 [2024-07-25 13:29:39.696807] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:59.110 [2024-07-25 13:29:39.696812] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:59.110 [2024-07-25 13:29:39.696817] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:59.110 [2024-07-25 13:29:39.696822] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:59.110 13:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:59.110 [2024-07-25 13:29:39.895939] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:59.110 BaseBdev1 00:20:59.371 13:29:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:59.371 13:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:20:59.371 13:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:20:59.371 13:29:39 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:20:59.371 13:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:20:59.371 13:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:20:59.371 13:29:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:59.371 13:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:59.631 [ 00:20:59.631 { 00:20:59.631 "name": "BaseBdev1", 00:20:59.631 "aliases": [ 00:20:59.631 "09f5bb50-318d-4fb2-aef5-161880bc3f48" 00:20:59.631 ], 00:20:59.631 "product_name": "Malloc disk", 00:20:59.631 "block_size": 512, 00:20:59.631 "num_blocks": 65536, 00:20:59.631 "uuid": "09f5bb50-318d-4fb2-aef5-161880bc3f48", 00:20:59.631 "assigned_rate_limits": { 00:20:59.631 "rw_ios_per_sec": 0, 00:20:59.631 "rw_mbytes_per_sec": 0, 00:20:59.631 "r_mbytes_per_sec": 0, 00:20:59.631 "w_mbytes_per_sec": 0 00:20:59.631 }, 00:20:59.631 "claimed": true, 00:20:59.631 "claim_type": "exclusive_write", 00:20:59.631 "zoned": false, 00:20:59.631 "supported_io_types": { 00:20:59.631 "read": true, 00:20:59.631 "write": true, 00:20:59.631 "unmap": true, 00:20:59.631 "flush": true, 00:20:59.631 "reset": true, 00:20:59.631 "nvme_admin": false, 00:20:59.631 "nvme_io": false, 00:20:59.631 "nvme_io_md": false, 00:20:59.631 "write_zeroes": true, 00:20:59.631 "zcopy": true, 00:20:59.631 "get_zone_info": false, 00:20:59.631 "zone_management": false, 00:20:59.631 "zone_append": false, 00:20:59.631 "compare": false, 00:20:59.631 "compare_and_write": false, 00:20:59.631 "abort": true, 00:20:59.631 "seek_hole": false, 00:20:59.631 "seek_data": false, 
00:20:59.631 "copy": true, 00:20:59.631 "nvme_iov_md": false 00:20:59.631 }, 00:20:59.631 "memory_domains": [ 00:20:59.631 { 00:20:59.631 "dma_device_id": "system", 00:20:59.631 "dma_device_type": 1 00:20:59.631 }, 00:20:59.631 { 00:20:59.631 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:59.631 "dma_device_type": 2 00:20:59.631 } 00:20:59.631 ], 00:20:59.631 "driver_specific": {} 00:20:59.631 } 00:20:59.631 ] 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:59.631 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- 
# jq -r '.[] | select(.name == "Existed_Raid")' 00:20:59.891 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:59.891 "name": "Existed_Raid", 00:20:59.891 "uuid": "53cfde4c-e865-4d70-84c7-c639053dc234", 00:20:59.891 "strip_size_kb": 0, 00:20:59.891 "state": "configuring", 00:20:59.891 "raid_level": "raid1", 00:20:59.891 "superblock": true, 00:20:59.891 "num_base_bdevs": 4, 00:20:59.891 "num_base_bdevs_discovered": 1, 00:20:59.891 "num_base_bdevs_operational": 4, 00:20:59.891 "base_bdevs_list": [ 00:20:59.891 { 00:20:59.891 "name": "BaseBdev1", 00:20:59.891 "uuid": "09f5bb50-318d-4fb2-aef5-161880bc3f48", 00:20:59.891 "is_configured": true, 00:20:59.891 "data_offset": 2048, 00:20:59.891 "data_size": 63488 00:20:59.891 }, 00:20:59.891 { 00:20:59.891 "name": "BaseBdev2", 00:20:59.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.891 "is_configured": false, 00:20:59.891 "data_offset": 0, 00:20:59.891 "data_size": 0 00:20:59.891 }, 00:20:59.891 { 00:20:59.891 "name": "BaseBdev3", 00:20:59.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.891 "is_configured": false, 00:20:59.891 "data_offset": 0, 00:20:59.891 "data_size": 0 00:20:59.891 }, 00:20:59.891 { 00:20:59.891 "name": "BaseBdev4", 00:20:59.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:59.891 "is_configured": false, 00:20:59.891 "data_offset": 0, 00:20:59.891 "data_size": 0 00:20:59.891 } 00:20:59.891 ] 00:20:59.891 }' 00:20:59.891 13:29:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:59.891 13:29:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:00.461 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:00.461 [2024-07-25 13:29:41.183287] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: 
Existed_Raid 00:21:00.461 [2024-07-25 13:29:41.183313] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd18f60 name Existed_Raid, state configuring 00:21:00.461 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:00.721 [2024-07-25 13:29:41.359771] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:00.721 [2024-07-25 13:29:41.360953] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:21:00.721 [2024-07-25 13:29:41.360977] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:21:00.721 [2024-07-25 13:29:41.360983] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:21:00.721 [2024-07-25 13:29:41.360988] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:21:00.721 [2024-07-25 13:29:41.360993] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:21:00.721 [2024-07-25 13:29:41.360998] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:00.721 13:29:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:00.721 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:00.982 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:00.982 "name": "Existed_Raid", 00:21:00.982 "uuid": "9cb28f9d-8eea-4250-9c9b-4950f236ab0a", 00:21:00.982 "strip_size_kb": 0, 00:21:00.982 "state": "configuring", 00:21:00.982 "raid_level": "raid1", 00:21:00.982 "superblock": true, 00:21:00.982 "num_base_bdevs": 4, 00:21:00.982 "num_base_bdevs_discovered": 1, 00:21:00.982 "num_base_bdevs_operational": 4, 00:21:00.982 "base_bdevs_list": [ 00:21:00.982 { 00:21:00.982 "name": "BaseBdev1", 00:21:00.982 "uuid": "09f5bb50-318d-4fb2-aef5-161880bc3f48", 00:21:00.982 "is_configured": true, 00:21:00.982 "data_offset": 2048, 00:21:00.982 "data_size": 63488 00:21:00.982 }, 00:21:00.982 { 00:21:00.982 "name": "BaseBdev2", 00:21:00.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.982 "is_configured": false, 
00:21:00.982 "data_offset": 0, 00:21:00.982 "data_size": 0 00:21:00.982 }, 00:21:00.982 { 00:21:00.982 "name": "BaseBdev3", 00:21:00.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.982 "is_configured": false, 00:21:00.982 "data_offset": 0, 00:21:00.982 "data_size": 0 00:21:00.982 }, 00:21:00.982 { 00:21:00.982 "name": "BaseBdev4", 00:21:00.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:00.982 "is_configured": false, 00:21:00.982 "data_offset": 0, 00:21:00.982 "data_size": 0 00:21:00.982 } 00:21:00.982 ] 00:21:00.982 }' 00:21:00.982 13:29:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:00.982 13:29:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:01.550 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:01.550 [2024-07-25 13:29:42.262980] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:01.550 BaseBdev2 00:21:01.551 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:21:01.551 13:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:21:01.551 13:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:01.551 13:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:01.551 13:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:01.551 13:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:01.551 13:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:21:01.810 13:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:02.071 [ 00:21:02.071 { 00:21:02.071 "name": "BaseBdev2", 00:21:02.071 "aliases": [ 00:21:02.071 "58175111-f2ac-4133-bdb0-703db1276b0f" 00:21:02.071 ], 00:21:02.071 "product_name": "Malloc disk", 00:21:02.071 "block_size": 512, 00:21:02.071 "num_blocks": 65536, 00:21:02.071 "uuid": "58175111-f2ac-4133-bdb0-703db1276b0f", 00:21:02.071 "assigned_rate_limits": { 00:21:02.071 "rw_ios_per_sec": 0, 00:21:02.071 "rw_mbytes_per_sec": 0, 00:21:02.071 "r_mbytes_per_sec": 0, 00:21:02.071 "w_mbytes_per_sec": 0 00:21:02.071 }, 00:21:02.071 "claimed": true, 00:21:02.071 "claim_type": "exclusive_write", 00:21:02.071 "zoned": false, 00:21:02.071 "supported_io_types": { 00:21:02.071 "read": true, 00:21:02.071 "write": true, 00:21:02.071 "unmap": true, 00:21:02.071 "flush": true, 00:21:02.071 "reset": true, 00:21:02.071 "nvme_admin": false, 00:21:02.071 "nvme_io": false, 00:21:02.071 "nvme_io_md": false, 00:21:02.071 "write_zeroes": true, 00:21:02.071 "zcopy": true, 00:21:02.071 "get_zone_info": false, 00:21:02.071 "zone_management": false, 00:21:02.071 "zone_append": false, 00:21:02.071 "compare": false, 00:21:02.071 "compare_and_write": false, 00:21:02.071 "abort": true, 00:21:02.071 "seek_hole": false, 00:21:02.071 "seek_data": false, 00:21:02.071 "copy": true, 00:21:02.071 "nvme_iov_md": false 00:21:02.071 }, 00:21:02.071 "memory_domains": [ 00:21:02.071 { 00:21:02.071 "dma_device_id": "system", 00:21:02.071 "dma_device_type": 1 00:21:02.071 }, 00:21:02.071 { 00:21:02.071 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:02.071 "dma_device_type": 2 00:21:02.071 } 00:21:02.071 ], 00:21:02.071 "driver_specific": {} 00:21:02.071 } 00:21:02.071 ] 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 
-- # return 0 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:02.071 "name": "Existed_Raid", 00:21:02.071 "uuid": "9cb28f9d-8eea-4250-9c9b-4950f236ab0a", 00:21:02.071 "strip_size_kb": 0, 
00:21:02.071 "state": "configuring", 00:21:02.071 "raid_level": "raid1", 00:21:02.071 "superblock": true, 00:21:02.071 "num_base_bdevs": 4, 00:21:02.071 "num_base_bdevs_discovered": 2, 00:21:02.071 "num_base_bdevs_operational": 4, 00:21:02.071 "base_bdevs_list": [ 00:21:02.071 { 00:21:02.071 "name": "BaseBdev1", 00:21:02.071 "uuid": "09f5bb50-318d-4fb2-aef5-161880bc3f48", 00:21:02.071 "is_configured": true, 00:21:02.071 "data_offset": 2048, 00:21:02.071 "data_size": 63488 00:21:02.071 }, 00:21:02.071 { 00:21:02.071 "name": "BaseBdev2", 00:21:02.071 "uuid": "58175111-f2ac-4133-bdb0-703db1276b0f", 00:21:02.071 "is_configured": true, 00:21:02.071 "data_offset": 2048, 00:21:02.071 "data_size": 63488 00:21:02.071 }, 00:21:02.071 { 00:21:02.071 "name": "BaseBdev3", 00:21:02.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.071 "is_configured": false, 00:21:02.071 "data_offset": 0, 00:21:02.071 "data_size": 0 00:21:02.071 }, 00:21:02.071 { 00:21:02.071 "name": "BaseBdev4", 00:21:02.071 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:02.071 "is_configured": false, 00:21:02.071 "data_offset": 0, 00:21:02.071 "data_size": 0 00:21:02.071 } 00:21:02.071 ] 00:21:02.071 }' 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:02.071 13:29:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:02.642 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:02.901 [2024-07-25 13:29:43.595426] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:02.901 BaseBdev3 00:21:02.901 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:21:02.901 13:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local 
bdev_name=BaseBdev3 00:21:02.901 13:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:02.901 13:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:02.901 13:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:02.901 13:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:02.901 13:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:03.160 13:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:21:03.420 [ 00:21:03.420 { 00:21:03.420 "name": "BaseBdev3", 00:21:03.420 "aliases": [ 00:21:03.420 "4a695c5f-e78a-435b-acba-dfafb480e249" 00:21:03.420 ], 00:21:03.420 "product_name": "Malloc disk", 00:21:03.420 "block_size": 512, 00:21:03.420 "num_blocks": 65536, 00:21:03.420 "uuid": "4a695c5f-e78a-435b-acba-dfafb480e249", 00:21:03.421 "assigned_rate_limits": { 00:21:03.421 "rw_ios_per_sec": 0, 00:21:03.421 "rw_mbytes_per_sec": 0, 00:21:03.421 "r_mbytes_per_sec": 0, 00:21:03.421 "w_mbytes_per_sec": 0 00:21:03.421 }, 00:21:03.421 "claimed": true, 00:21:03.421 "claim_type": "exclusive_write", 00:21:03.421 "zoned": false, 00:21:03.421 "supported_io_types": { 00:21:03.421 "read": true, 00:21:03.421 "write": true, 00:21:03.421 "unmap": true, 00:21:03.421 "flush": true, 00:21:03.421 "reset": true, 00:21:03.421 "nvme_admin": false, 00:21:03.421 "nvme_io": false, 00:21:03.421 "nvme_io_md": false, 00:21:03.421 "write_zeroes": true, 00:21:03.421 "zcopy": true, 00:21:03.421 "get_zone_info": false, 00:21:03.421 "zone_management": false, 00:21:03.421 "zone_append": false, 00:21:03.421 
"compare": false, 00:21:03.421 "compare_and_write": false, 00:21:03.421 "abort": true, 00:21:03.421 "seek_hole": false, 00:21:03.421 "seek_data": false, 00:21:03.421 "copy": true, 00:21:03.421 "nvme_iov_md": false 00:21:03.421 }, 00:21:03.421 "memory_domains": [ 00:21:03.421 { 00:21:03.421 "dma_device_id": "system", 00:21:03.421 "dma_device_type": 1 00:21:03.421 }, 00:21:03.421 { 00:21:03.421 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:03.421 "dma_device_type": 2 00:21:03.421 } 00:21:03.421 ], 00:21:03.421 "driver_specific": {} 00:21:03.421 } 00:21:03.421 ] 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.421 13:29:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.421 13:29:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:03.421 13:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.421 "name": "Existed_Raid", 00:21:03.421 "uuid": "9cb28f9d-8eea-4250-9c9b-4950f236ab0a", 00:21:03.421 "strip_size_kb": 0, 00:21:03.421 "state": "configuring", 00:21:03.421 "raid_level": "raid1", 00:21:03.421 "superblock": true, 00:21:03.421 "num_base_bdevs": 4, 00:21:03.421 "num_base_bdevs_discovered": 3, 00:21:03.421 "num_base_bdevs_operational": 4, 00:21:03.421 "base_bdevs_list": [ 00:21:03.421 { 00:21:03.421 "name": "BaseBdev1", 00:21:03.421 "uuid": "09f5bb50-318d-4fb2-aef5-161880bc3f48", 00:21:03.421 "is_configured": true, 00:21:03.421 "data_offset": 2048, 00:21:03.421 "data_size": 63488 00:21:03.421 }, 00:21:03.421 { 00:21:03.421 "name": "BaseBdev2", 00:21:03.421 "uuid": "58175111-f2ac-4133-bdb0-703db1276b0f", 00:21:03.421 "is_configured": true, 00:21:03.421 "data_offset": 2048, 00:21:03.421 "data_size": 63488 00:21:03.421 }, 00:21:03.421 { 00:21:03.421 "name": "BaseBdev3", 00:21:03.421 "uuid": "4a695c5f-e78a-435b-acba-dfafb480e249", 00:21:03.421 "is_configured": true, 00:21:03.421 "data_offset": 2048, 00:21:03.421 "data_size": 63488 00:21:03.421 }, 00:21:03.421 { 00:21:03.421 "name": "BaseBdev4", 00:21:03.421 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.421 "is_configured": false, 00:21:03.421 "data_offset": 0, 00:21:03.421 "data_size": 0 00:21:03.421 } 00:21:03.421 ] 00:21:03.421 }' 00:21:03.421 13:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.421 13:29:44 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:03.991 13:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:04.252 [2024-07-25 13:29:44.919680] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:04.252 [2024-07-25 13:29:44.919806] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xd19fd0 00:21:04.252 [2024-07-25 13:29:44.919814] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:04.252 [2024-07-25 13:29:44.919951] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xebe8e0 00:21:04.252 [2024-07-25 13:29:44.920049] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd19fd0 00:21:04.252 [2024-07-25 13:29:44.920055] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd19fd0 00:21:04.252 [2024-07-25 13:29:44.920125] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:04.252 BaseBdev4 00:21:04.252 13:29:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:21:04.252 13:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:21:04.252 13:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:04.252 13:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:04.252 13:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:04.252 13:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:04.252 13:29:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:04.512 13:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:04.772 [ 00:21:04.772 { 00:21:04.772 "name": "BaseBdev4", 00:21:04.772 "aliases": [ 00:21:04.772 "c54c8086-95ec-405e-b118-0d1d2f3b693e" 00:21:04.772 ], 00:21:04.772 "product_name": "Malloc disk", 00:21:04.772 "block_size": 512, 00:21:04.772 "num_blocks": 65536, 00:21:04.772 "uuid": "c54c8086-95ec-405e-b118-0d1d2f3b693e", 00:21:04.772 "assigned_rate_limits": { 00:21:04.772 "rw_ios_per_sec": 0, 00:21:04.772 "rw_mbytes_per_sec": 0, 00:21:04.772 "r_mbytes_per_sec": 0, 00:21:04.772 "w_mbytes_per_sec": 0 00:21:04.772 }, 00:21:04.772 "claimed": true, 00:21:04.772 "claim_type": "exclusive_write", 00:21:04.772 "zoned": false, 00:21:04.772 "supported_io_types": { 00:21:04.772 "read": true, 00:21:04.772 "write": true, 00:21:04.772 "unmap": true, 00:21:04.772 "flush": true, 00:21:04.772 "reset": true, 00:21:04.772 "nvme_admin": false, 00:21:04.772 "nvme_io": false, 00:21:04.772 "nvme_io_md": false, 00:21:04.772 "write_zeroes": true, 00:21:04.772 "zcopy": true, 00:21:04.772 "get_zone_info": false, 00:21:04.772 "zone_management": false, 00:21:04.772 "zone_append": false, 00:21:04.772 "compare": false, 00:21:04.772 "compare_and_write": false, 00:21:04.772 "abort": true, 00:21:04.772 "seek_hole": false, 00:21:04.772 "seek_data": false, 00:21:04.772 "copy": true, 00:21:04.772 "nvme_iov_md": false 00:21:04.772 }, 00:21:04.772 "memory_domains": [ 00:21:04.772 { 00:21:04.772 "dma_device_id": "system", 00:21:04.772 "dma_device_type": 1 00:21:04.772 }, 00:21:04.772 { 00:21:04.772 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:04.772 "dma_device_type": 2 00:21:04.772 } 00:21:04.772 ], 00:21:04.772 "driver_specific": {} 00:21:04.772 } 00:21:04.772 ] 
00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.772 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.773 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.773 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.773 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:04.773 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:05.343 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:05.343 "name": "Existed_Raid", 00:21:05.343 
"uuid": "9cb28f9d-8eea-4250-9c9b-4950f236ab0a", 00:21:05.343 "strip_size_kb": 0, 00:21:05.343 "state": "online", 00:21:05.343 "raid_level": "raid1", 00:21:05.343 "superblock": true, 00:21:05.343 "num_base_bdevs": 4, 00:21:05.343 "num_base_bdevs_discovered": 4, 00:21:05.343 "num_base_bdevs_operational": 4, 00:21:05.343 "base_bdevs_list": [ 00:21:05.343 { 00:21:05.343 "name": "BaseBdev1", 00:21:05.343 "uuid": "09f5bb50-318d-4fb2-aef5-161880bc3f48", 00:21:05.343 "is_configured": true, 00:21:05.343 "data_offset": 2048, 00:21:05.343 "data_size": 63488 00:21:05.343 }, 00:21:05.343 { 00:21:05.343 "name": "BaseBdev2", 00:21:05.343 "uuid": "58175111-f2ac-4133-bdb0-703db1276b0f", 00:21:05.343 "is_configured": true, 00:21:05.343 "data_offset": 2048, 00:21:05.343 "data_size": 63488 00:21:05.343 }, 00:21:05.343 { 00:21:05.343 "name": "BaseBdev3", 00:21:05.343 "uuid": "4a695c5f-e78a-435b-acba-dfafb480e249", 00:21:05.343 "is_configured": true, 00:21:05.343 "data_offset": 2048, 00:21:05.343 "data_size": 63488 00:21:05.343 }, 00:21:05.343 { 00:21:05.343 "name": "BaseBdev4", 00:21:05.343 "uuid": "c54c8086-95ec-405e-b118-0d1d2f3b693e", 00:21:05.343 "is_configured": true, 00:21:05.343 "data_offset": 2048, 00:21:05.343 "data_size": 63488 00:21:05.343 } 00:21:05.343 ] 00:21:05.343 }' 00:21:05.343 13:29:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:05.343 13:29:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:05.912 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:21:05.912 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:05.912 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:05.913 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:05.913 13:29:46 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:05.913 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:05.913 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:05.913 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:05.913 [2024-07-25 13:29:46.648316] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:05.913 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:05.913 "name": "Existed_Raid", 00:21:05.913 "aliases": [ 00:21:05.913 "9cb28f9d-8eea-4250-9c9b-4950f236ab0a" 00:21:05.913 ], 00:21:05.913 "product_name": "Raid Volume", 00:21:05.913 "block_size": 512, 00:21:05.913 "num_blocks": 63488, 00:21:05.913 "uuid": "9cb28f9d-8eea-4250-9c9b-4950f236ab0a", 00:21:05.913 "assigned_rate_limits": { 00:21:05.913 "rw_ios_per_sec": 0, 00:21:05.913 "rw_mbytes_per_sec": 0, 00:21:05.913 "r_mbytes_per_sec": 0, 00:21:05.913 "w_mbytes_per_sec": 0 00:21:05.913 }, 00:21:05.913 "claimed": false, 00:21:05.913 "zoned": false, 00:21:05.913 "supported_io_types": { 00:21:05.913 "read": true, 00:21:05.913 "write": true, 00:21:05.913 "unmap": false, 00:21:05.913 "flush": false, 00:21:05.913 "reset": true, 00:21:05.913 "nvme_admin": false, 00:21:05.913 "nvme_io": false, 00:21:05.913 "nvme_io_md": false, 00:21:05.913 "write_zeroes": true, 00:21:05.913 "zcopy": false, 00:21:05.913 "get_zone_info": false, 00:21:05.913 "zone_management": false, 00:21:05.913 "zone_append": false, 00:21:05.913 "compare": false, 00:21:05.913 "compare_and_write": false, 00:21:05.913 "abort": false, 00:21:05.913 "seek_hole": false, 00:21:05.913 "seek_data": false, 00:21:05.913 "copy": false, 00:21:05.913 "nvme_iov_md": false 00:21:05.913 }, 00:21:05.913 
"memory_domains": [ 00:21:05.913 { 00:21:05.913 "dma_device_id": "system", 00:21:05.913 "dma_device_type": 1 00:21:05.913 }, 00:21:05.913 { 00:21:05.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.913 "dma_device_type": 2 00:21:05.913 }, 00:21:05.913 { 00:21:05.913 "dma_device_id": "system", 00:21:05.913 "dma_device_type": 1 00:21:05.913 }, 00:21:05.913 { 00:21:05.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.913 "dma_device_type": 2 00:21:05.913 }, 00:21:05.913 { 00:21:05.913 "dma_device_id": "system", 00:21:05.913 "dma_device_type": 1 00:21:05.913 }, 00:21:05.913 { 00:21:05.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.913 "dma_device_type": 2 00:21:05.913 }, 00:21:05.913 { 00:21:05.913 "dma_device_id": "system", 00:21:05.913 "dma_device_type": 1 00:21:05.913 }, 00:21:05.913 { 00:21:05.913 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:05.913 "dma_device_type": 2 00:21:05.913 } 00:21:05.913 ], 00:21:05.913 "driver_specific": { 00:21:05.913 "raid": { 00:21:05.913 "uuid": "9cb28f9d-8eea-4250-9c9b-4950f236ab0a", 00:21:05.913 "strip_size_kb": 0, 00:21:05.913 "state": "online", 00:21:05.913 "raid_level": "raid1", 00:21:05.913 "superblock": true, 00:21:05.913 "num_base_bdevs": 4, 00:21:05.913 "num_base_bdevs_discovered": 4, 00:21:05.913 "num_base_bdevs_operational": 4, 00:21:05.913 "base_bdevs_list": [ 00:21:05.913 { 00:21:05.913 "name": "BaseBdev1", 00:21:05.913 "uuid": "09f5bb50-318d-4fb2-aef5-161880bc3f48", 00:21:05.913 "is_configured": true, 00:21:05.913 "data_offset": 2048, 00:21:05.913 "data_size": 63488 00:21:05.913 }, 00:21:05.913 { 00:21:05.913 "name": "BaseBdev2", 00:21:05.913 "uuid": "58175111-f2ac-4133-bdb0-703db1276b0f", 00:21:05.913 "is_configured": true, 00:21:05.913 "data_offset": 2048, 00:21:05.913 "data_size": 63488 00:21:05.913 }, 00:21:05.913 { 00:21:05.913 "name": "BaseBdev3", 00:21:05.913 "uuid": "4a695c5f-e78a-435b-acba-dfafb480e249", 00:21:05.913 "is_configured": true, 00:21:05.913 "data_offset": 2048, 00:21:05.913 
"data_size": 63488 00:21:05.913 }, 00:21:05.913 { 00:21:05.913 "name": "BaseBdev4", 00:21:05.913 "uuid": "c54c8086-95ec-405e-b118-0d1d2f3b693e", 00:21:05.913 "is_configured": true, 00:21:05.913 "data_offset": 2048, 00:21:05.913 "data_size": 63488 00:21:05.913 } 00:21:05.913 ] 00:21:05.913 } 00:21:05.913 } 00:21:05.913 }' 00:21:05.913 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:06.173 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:21:06.173 BaseBdev2 00:21:06.173 BaseBdev3 00:21:06.173 BaseBdev4' 00:21:06.173 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:06.173 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:21:06.173 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:06.173 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:06.173 "name": "BaseBdev1", 00:21:06.173 "aliases": [ 00:21:06.173 "09f5bb50-318d-4fb2-aef5-161880bc3f48" 00:21:06.173 ], 00:21:06.173 "product_name": "Malloc disk", 00:21:06.173 "block_size": 512, 00:21:06.173 "num_blocks": 65536, 00:21:06.173 "uuid": "09f5bb50-318d-4fb2-aef5-161880bc3f48", 00:21:06.173 "assigned_rate_limits": { 00:21:06.173 "rw_ios_per_sec": 0, 00:21:06.173 "rw_mbytes_per_sec": 0, 00:21:06.173 "r_mbytes_per_sec": 0, 00:21:06.173 "w_mbytes_per_sec": 0 00:21:06.173 }, 00:21:06.173 "claimed": true, 00:21:06.173 "claim_type": "exclusive_write", 00:21:06.173 "zoned": false, 00:21:06.173 "supported_io_types": { 00:21:06.173 "read": true, 00:21:06.173 "write": true, 00:21:06.173 "unmap": true, 00:21:06.173 "flush": true, 00:21:06.173 "reset": true, 
00:21:06.173 "nvme_admin": false, 00:21:06.173 "nvme_io": false, 00:21:06.173 "nvme_io_md": false, 00:21:06.173 "write_zeroes": true, 00:21:06.173 "zcopy": true, 00:21:06.173 "get_zone_info": false, 00:21:06.173 "zone_management": false, 00:21:06.173 "zone_append": false, 00:21:06.173 "compare": false, 00:21:06.173 "compare_and_write": false, 00:21:06.173 "abort": true, 00:21:06.173 "seek_hole": false, 00:21:06.173 "seek_data": false, 00:21:06.173 "copy": true, 00:21:06.173 "nvme_iov_md": false 00:21:06.173 }, 00:21:06.173 "memory_domains": [ 00:21:06.173 { 00:21:06.173 "dma_device_id": "system", 00:21:06.173 "dma_device_type": 1 00:21:06.173 }, 00:21:06.173 { 00:21:06.173 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.173 "dma_device_type": 2 00:21:06.173 } 00:21:06.173 ], 00:21:06.173 "driver_specific": {} 00:21:06.173 }' 00:21:06.173 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.173 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.433 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:06.433 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.433 13:29:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.433 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:06.433 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.433 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.434 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:06.434 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.434 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:06.434 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:06.434 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:06.434 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:06.434 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:06.694 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:06.694 "name": "BaseBdev2", 00:21:06.694 "aliases": [ 00:21:06.694 "58175111-f2ac-4133-bdb0-703db1276b0f" 00:21:06.694 ], 00:21:06.694 "product_name": "Malloc disk", 00:21:06.694 "block_size": 512, 00:21:06.694 "num_blocks": 65536, 00:21:06.694 "uuid": "58175111-f2ac-4133-bdb0-703db1276b0f", 00:21:06.694 "assigned_rate_limits": { 00:21:06.694 "rw_ios_per_sec": 0, 00:21:06.694 "rw_mbytes_per_sec": 0, 00:21:06.694 "r_mbytes_per_sec": 0, 00:21:06.694 "w_mbytes_per_sec": 0 00:21:06.694 }, 00:21:06.694 "claimed": true, 00:21:06.694 "claim_type": "exclusive_write", 00:21:06.694 "zoned": false, 00:21:06.694 "supported_io_types": { 00:21:06.694 "read": true, 00:21:06.694 "write": true, 00:21:06.694 "unmap": true, 00:21:06.694 "flush": true, 00:21:06.694 "reset": true, 00:21:06.694 "nvme_admin": false, 00:21:06.694 "nvme_io": false, 00:21:06.694 "nvme_io_md": false, 00:21:06.694 "write_zeroes": true, 00:21:06.694 "zcopy": true, 00:21:06.694 "get_zone_info": false, 00:21:06.694 "zone_management": false, 00:21:06.694 "zone_append": false, 00:21:06.694 "compare": false, 00:21:06.694 "compare_and_write": false, 00:21:06.694 "abort": true, 00:21:06.694 "seek_hole": false, 00:21:06.694 "seek_data": false, 00:21:06.694 "copy": true, 00:21:06.694 "nvme_iov_md": false 00:21:06.694 }, 00:21:06.694 "memory_domains": [ 00:21:06.694 { 
00:21:06.694 "dma_device_id": "system", 00:21:06.694 "dma_device_type": 1 00:21:06.694 }, 00:21:06.694 { 00:21:06.694 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:06.694 "dma_device_type": 2 00:21:06.694 } 00:21:06.694 ], 00:21:06.694 "driver_specific": {} 00:21:06.694 }' 00:21:06.694 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.694 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:06.694 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:06.694 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:06.954 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:07.214 13:29:47 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:07.214 "name": "BaseBdev3", 00:21:07.214 "aliases": [ 00:21:07.214 "4a695c5f-e78a-435b-acba-dfafb480e249" 00:21:07.214 ], 00:21:07.214 "product_name": "Malloc disk", 00:21:07.214 "block_size": 512, 00:21:07.214 "num_blocks": 65536, 00:21:07.214 "uuid": "4a695c5f-e78a-435b-acba-dfafb480e249", 00:21:07.214 "assigned_rate_limits": { 00:21:07.214 "rw_ios_per_sec": 0, 00:21:07.214 "rw_mbytes_per_sec": 0, 00:21:07.214 "r_mbytes_per_sec": 0, 00:21:07.214 "w_mbytes_per_sec": 0 00:21:07.214 }, 00:21:07.214 "claimed": true, 00:21:07.214 "claim_type": "exclusive_write", 00:21:07.214 "zoned": false, 00:21:07.214 "supported_io_types": { 00:21:07.214 "read": true, 00:21:07.214 "write": true, 00:21:07.214 "unmap": true, 00:21:07.214 "flush": true, 00:21:07.214 "reset": true, 00:21:07.214 "nvme_admin": false, 00:21:07.214 "nvme_io": false, 00:21:07.214 "nvme_io_md": false, 00:21:07.214 "write_zeroes": true, 00:21:07.214 "zcopy": true, 00:21:07.214 "get_zone_info": false, 00:21:07.214 "zone_management": false, 00:21:07.214 "zone_append": false, 00:21:07.214 "compare": false, 00:21:07.214 "compare_and_write": false, 00:21:07.214 "abort": true, 00:21:07.214 "seek_hole": false, 00:21:07.214 "seek_data": false, 00:21:07.214 "copy": true, 00:21:07.214 "nvme_iov_md": false 00:21:07.214 }, 00:21:07.214 "memory_domains": [ 00:21:07.214 { 00:21:07.214 "dma_device_id": "system", 00:21:07.214 "dma_device_type": 1 00:21:07.214 }, 00:21:07.214 { 00:21:07.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.214 "dma_device_type": 2 00:21:07.214 } 00:21:07.214 ], 00:21:07.214 "driver_specific": {} 00:21:07.214 }' 00:21:07.214 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.214 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.214 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 
-- # [[ 512 == 512 ]] 00:21:07.214 13:29:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:07.475 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:07.735 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:07.735 "name": "BaseBdev4", 00:21:07.735 "aliases": [ 00:21:07.735 "c54c8086-95ec-405e-b118-0d1d2f3b693e" 00:21:07.735 ], 00:21:07.735 "product_name": "Malloc disk", 00:21:07.735 "block_size": 512, 00:21:07.735 "num_blocks": 65536, 00:21:07.735 "uuid": "c54c8086-95ec-405e-b118-0d1d2f3b693e", 00:21:07.735 "assigned_rate_limits": { 00:21:07.735 "rw_ios_per_sec": 0, 00:21:07.735 "rw_mbytes_per_sec": 0, 00:21:07.735 "r_mbytes_per_sec": 0, 00:21:07.735 "w_mbytes_per_sec": 0 
00:21:07.735 }, 00:21:07.735 "claimed": true, 00:21:07.735 "claim_type": "exclusive_write", 00:21:07.735 "zoned": false, 00:21:07.735 "supported_io_types": { 00:21:07.735 "read": true, 00:21:07.735 "write": true, 00:21:07.735 "unmap": true, 00:21:07.735 "flush": true, 00:21:07.735 "reset": true, 00:21:07.735 "nvme_admin": false, 00:21:07.735 "nvme_io": false, 00:21:07.735 "nvme_io_md": false, 00:21:07.735 "write_zeroes": true, 00:21:07.735 "zcopy": true, 00:21:07.735 "get_zone_info": false, 00:21:07.735 "zone_management": false, 00:21:07.735 "zone_append": false, 00:21:07.735 "compare": false, 00:21:07.735 "compare_and_write": false, 00:21:07.735 "abort": true, 00:21:07.735 "seek_hole": false, 00:21:07.735 "seek_data": false, 00:21:07.735 "copy": true, 00:21:07.735 "nvme_iov_md": false 00:21:07.735 }, 00:21:07.735 "memory_domains": [ 00:21:07.735 { 00:21:07.735 "dma_device_id": "system", 00:21:07.735 "dma_device_type": 1 00:21:07.735 }, 00:21:07.735 { 00:21:07.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:07.735 "dma_device_type": 2 00:21:07.735 } 00:21:07.735 ], 00:21:07.735 "driver_specific": {} 00:21:07.735 }' 00:21:07.735 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.735 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:07.995 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:07.995 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.995 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:07.995 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:07.995 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.995 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:07.995 
13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:07.995 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.995 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:07.995 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:07.995 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:08.257 [2024-07-25 13:29:48.953920] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:08.257 13:29:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:08.517 13:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:08.517 "name": "Existed_Raid", 00:21:08.517 "uuid": "9cb28f9d-8eea-4250-9c9b-4950f236ab0a", 00:21:08.517 "strip_size_kb": 0, 00:21:08.517 "state": "online", 00:21:08.517 "raid_level": "raid1", 00:21:08.517 "superblock": true, 00:21:08.517 "num_base_bdevs": 4, 00:21:08.517 "num_base_bdevs_discovered": 3, 00:21:08.517 "num_base_bdevs_operational": 3, 00:21:08.517 "base_bdevs_list": [ 00:21:08.517 { 00:21:08.517 "name": null, 00:21:08.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:08.517 "is_configured": false, 00:21:08.517 "data_offset": 2048, 00:21:08.517 "data_size": 63488 00:21:08.517 }, 00:21:08.517 { 00:21:08.517 "name": "BaseBdev2", 00:21:08.517 "uuid": "58175111-f2ac-4133-bdb0-703db1276b0f", 00:21:08.517 "is_configured": true, 00:21:08.517 "data_offset": 2048, 00:21:08.517 "data_size": 63488 00:21:08.517 }, 00:21:08.517 { 00:21:08.517 "name": "BaseBdev3", 00:21:08.517 "uuid": "4a695c5f-e78a-435b-acba-dfafb480e249", 00:21:08.517 "is_configured": true, 00:21:08.517 "data_offset": 2048, 00:21:08.517 "data_size": 63488 00:21:08.517 }, 00:21:08.517 { 00:21:08.517 "name": 
"BaseBdev4", 00:21:08.518 "uuid": "c54c8086-95ec-405e-b118-0d1d2f3b693e", 00:21:08.518 "is_configured": true, 00:21:08.518 "data_offset": 2048, 00:21:08.518 "data_size": 63488 00:21:08.518 } 00:21:08.518 ] 00:21:08.518 }' 00:21:08.518 13:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:08.518 13:29:49 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:09.089 13:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:21:09.089 13:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:09.089 13:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.089 13:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:09.349 13:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:09.349 13:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:09.349 13:29:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:21:09.349 [2024-07-25 13:29:50.064744] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:09.349 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:09.349 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:09.349 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.349 13:29:50 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:09.609 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:09.609 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:09.609 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:21:09.869 [2024-07-25 13:29:50.451581] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:09.869 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:09.869 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:09.869 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.869 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:21:10.129 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:21:10.129 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:21:10.129 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:21:10.129 [2024-07-25 13:29:50.838334] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:21:10.129 [2024-07-25 13:29:50.838394] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:10.129 [2024-07-25 13:29:50.844382] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:10.129 [2024-07-25 13:29:50.844406] bdev_raid.c: 
464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:10.129 [2024-07-25 13:29:50.844412] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd19fd0 name Existed_Raid, state offline 00:21:10.129 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:21:10.129 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:21:10.129 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:10.129 13:29:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:21:10.389 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:21:10.389 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:21:10.389 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:21:10.389 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:21:10.389 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:10.390 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:21:10.649 BaseBdev2 00:21:10.649 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:21:10.649 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:21:10.649 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:10.649 13:29:51 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@901 -- # local i 00:21:10.649 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:10.649 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:10.649 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:10.649 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:21:10.909 [ 00:21:10.909 { 00:21:10.909 "name": "BaseBdev2", 00:21:10.909 "aliases": [ 00:21:10.909 "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc" 00:21:10.909 ], 00:21:10.909 "product_name": "Malloc disk", 00:21:10.909 "block_size": 512, 00:21:10.909 "num_blocks": 65536, 00:21:10.909 "uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:10.909 "assigned_rate_limits": { 00:21:10.909 "rw_ios_per_sec": 0, 00:21:10.909 "rw_mbytes_per_sec": 0, 00:21:10.909 "r_mbytes_per_sec": 0, 00:21:10.909 "w_mbytes_per_sec": 0 00:21:10.909 }, 00:21:10.909 "claimed": false, 00:21:10.909 "zoned": false, 00:21:10.909 "supported_io_types": { 00:21:10.909 "read": true, 00:21:10.909 "write": true, 00:21:10.909 "unmap": true, 00:21:10.909 "flush": true, 00:21:10.909 "reset": true, 00:21:10.909 "nvme_admin": false, 00:21:10.909 "nvme_io": false, 00:21:10.909 "nvme_io_md": false, 00:21:10.909 "write_zeroes": true, 00:21:10.909 "zcopy": true, 00:21:10.909 "get_zone_info": false, 00:21:10.909 "zone_management": false, 00:21:10.909 "zone_append": false, 00:21:10.909 "compare": false, 00:21:10.909 "compare_and_write": false, 00:21:10.909 "abort": true, 00:21:10.909 "seek_hole": false, 00:21:10.909 "seek_data": false, 00:21:10.909 "copy": true, 00:21:10.909 "nvme_iov_md": false 00:21:10.909 }, 00:21:10.909 
"memory_domains": [ 00:21:10.909 { 00:21:10.909 "dma_device_id": "system", 00:21:10.909 "dma_device_type": 1 00:21:10.909 }, 00:21:10.909 { 00:21:10.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:10.909 "dma_device_type": 2 00:21:10.909 } 00:21:10.909 ], 00:21:10.909 "driver_specific": {} 00:21:10.909 } 00:21:10.909 ] 00:21:10.909 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:10.909 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:10.909 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:10.909 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:21:11.169 BaseBdev3 00:21:11.169 13:29:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:21:11.169 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev3 00:21:11.169 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:11.169 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:11.169 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:11.170 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:11.170 13:29:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:11.430 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev3 -t 2000 00:21:11.430 [ 00:21:11.430 { 00:21:11.430 "name": "BaseBdev3", 00:21:11.430 "aliases": [ 00:21:11.430 "02bcd612-d3da-47b4-8173-8f61ba439188" 00:21:11.430 ], 00:21:11.430 "product_name": "Malloc disk", 00:21:11.430 "block_size": 512, 00:21:11.430 "num_blocks": 65536, 00:21:11.430 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:11.430 "assigned_rate_limits": { 00:21:11.430 "rw_ios_per_sec": 0, 00:21:11.430 "rw_mbytes_per_sec": 0, 00:21:11.430 "r_mbytes_per_sec": 0, 00:21:11.430 "w_mbytes_per_sec": 0 00:21:11.430 }, 00:21:11.430 "claimed": false, 00:21:11.430 "zoned": false, 00:21:11.430 "supported_io_types": { 00:21:11.430 "read": true, 00:21:11.430 "write": true, 00:21:11.430 "unmap": true, 00:21:11.430 "flush": true, 00:21:11.430 "reset": true, 00:21:11.430 "nvme_admin": false, 00:21:11.430 "nvme_io": false, 00:21:11.430 "nvme_io_md": false, 00:21:11.430 "write_zeroes": true, 00:21:11.430 "zcopy": true, 00:21:11.430 "get_zone_info": false, 00:21:11.430 "zone_management": false, 00:21:11.430 "zone_append": false, 00:21:11.430 "compare": false, 00:21:11.430 "compare_and_write": false, 00:21:11.430 "abort": true, 00:21:11.430 "seek_hole": false, 00:21:11.430 "seek_data": false, 00:21:11.430 "copy": true, 00:21:11.430 "nvme_iov_md": false 00:21:11.430 }, 00:21:11.430 "memory_domains": [ 00:21:11.430 { 00:21:11.430 "dma_device_id": "system", 00:21:11.430 "dma_device_type": 1 00:21:11.430 }, 00:21:11.430 { 00:21:11.430 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:11.430 "dma_device_type": 2 00:21:11.430 } 00:21:11.430 ], 00:21:11.430 "driver_specific": {} 00:21:11.430 } 00:21:11.430 ] 00:21:11.430 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:11.430 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:11.430 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:11.430 13:29:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:21:11.691 BaseBdev4 00:21:11.691 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:21:11.691 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev4 00:21:11.691 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:11.691 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:11.691 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:11.691 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:11.691 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:11.951 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:21:12.211 [ 00:21:12.211 { 00:21:12.211 "name": "BaseBdev4", 00:21:12.211 "aliases": [ 00:21:12.211 "9e19c743-e19e-470d-90e3-7e1f8b6cac10" 00:21:12.211 ], 00:21:12.211 "product_name": "Malloc disk", 00:21:12.211 "block_size": 512, 00:21:12.211 "num_blocks": 65536, 00:21:12.211 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:12.211 "assigned_rate_limits": { 00:21:12.211 "rw_ios_per_sec": 0, 00:21:12.211 "rw_mbytes_per_sec": 0, 00:21:12.211 "r_mbytes_per_sec": 0, 00:21:12.211 "w_mbytes_per_sec": 0 00:21:12.211 }, 00:21:12.211 "claimed": false, 00:21:12.211 "zoned": false, 00:21:12.211 "supported_io_types": { 00:21:12.211 "read": true, 
00:21:12.211 "write": true, 00:21:12.211 "unmap": true, 00:21:12.211 "flush": true, 00:21:12.211 "reset": true, 00:21:12.211 "nvme_admin": false, 00:21:12.211 "nvme_io": false, 00:21:12.211 "nvme_io_md": false, 00:21:12.211 "write_zeroes": true, 00:21:12.211 "zcopy": true, 00:21:12.211 "get_zone_info": false, 00:21:12.211 "zone_management": false, 00:21:12.211 "zone_append": false, 00:21:12.211 "compare": false, 00:21:12.211 "compare_and_write": false, 00:21:12.211 "abort": true, 00:21:12.211 "seek_hole": false, 00:21:12.211 "seek_data": false, 00:21:12.211 "copy": true, 00:21:12.211 "nvme_iov_md": false 00:21:12.211 }, 00:21:12.211 "memory_domains": [ 00:21:12.211 { 00:21:12.211 "dma_device_id": "system", 00:21:12.211 "dma_device_type": 1 00:21:12.211 }, 00:21:12.211 { 00:21:12.211 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:12.211 "dma_device_type": 2 00:21:12.211 } 00:21:12.211 ], 00:21:12.211 "driver_specific": {} 00:21:12.211 } 00:21:12.211 ] 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:21:12.211 [2024-07-25 13:29:52.937500] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:21:12.211 [2024-07-25 13:29:52.937528] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:21:12.211 [2024-07-25 13:29:52.937540] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:12.211 [2024-07-25 13:29:52.938568] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:12.211 [2024-07-25 13:29:52.938599] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:12.211 13:29:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:12.470 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:12.470 "name": "Existed_Raid", 00:21:12.470 "uuid": "73dd6fed-5072-4934-8ed8-c02eb18e6ed1", 00:21:12.470 "strip_size_kb": 0, 00:21:12.470 "state": 
"configuring", 00:21:12.470 "raid_level": "raid1", 00:21:12.470 "superblock": true, 00:21:12.470 "num_base_bdevs": 4, 00:21:12.470 "num_base_bdevs_discovered": 3, 00:21:12.470 "num_base_bdevs_operational": 4, 00:21:12.470 "base_bdevs_list": [ 00:21:12.470 { 00:21:12.470 "name": "BaseBdev1", 00:21:12.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:12.470 "is_configured": false, 00:21:12.470 "data_offset": 0, 00:21:12.470 "data_size": 0 00:21:12.470 }, 00:21:12.470 { 00:21:12.470 "name": "BaseBdev2", 00:21:12.470 "uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:12.470 "is_configured": true, 00:21:12.470 "data_offset": 2048, 00:21:12.470 "data_size": 63488 00:21:12.470 }, 00:21:12.470 { 00:21:12.470 "name": "BaseBdev3", 00:21:12.470 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:12.470 "is_configured": true, 00:21:12.470 "data_offset": 2048, 00:21:12.470 "data_size": 63488 00:21:12.470 }, 00:21:12.470 { 00:21:12.470 "name": "BaseBdev4", 00:21:12.470 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:12.470 "is_configured": true, 00:21:12.470 "data_offset": 2048, 00:21:12.470 "data_size": 63488 00:21:12.470 } 00:21:12.470 ] 00:21:12.470 }' 00:21:12.470 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:12.470 13:29:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:13.038 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:21:13.298 [2024-07-25 13:29:53.859921] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:13.298 
13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.298 13:29:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:13.298 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:13.298 "name": "Existed_Raid", 00:21:13.298 "uuid": "73dd6fed-5072-4934-8ed8-c02eb18e6ed1", 00:21:13.298 "strip_size_kb": 0, 00:21:13.298 "state": "configuring", 00:21:13.298 "raid_level": "raid1", 00:21:13.298 "superblock": true, 00:21:13.298 "num_base_bdevs": 4, 00:21:13.298 "num_base_bdevs_discovered": 2, 00:21:13.298 "num_base_bdevs_operational": 4, 00:21:13.298 "base_bdevs_list": [ 00:21:13.298 { 00:21:13.298 "name": "BaseBdev1", 00:21:13.298 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:13.298 "is_configured": false, 00:21:13.298 "data_offset": 0, 00:21:13.298 "data_size": 0 00:21:13.298 }, 00:21:13.298 { 00:21:13.298 
"name": null, 00:21:13.298 "uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:13.298 "is_configured": false, 00:21:13.298 "data_offset": 2048, 00:21:13.298 "data_size": 63488 00:21:13.298 }, 00:21:13.298 { 00:21:13.298 "name": "BaseBdev3", 00:21:13.298 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:13.298 "is_configured": true, 00:21:13.298 "data_offset": 2048, 00:21:13.298 "data_size": 63488 00:21:13.298 }, 00:21:13.298 { 00:21:13.298 "name": "BaseBdev4", 00:21:13.298 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:13.298 "is_configured": true, 00:21:13.298 "data_offset": 2048, 00:21:13.298 "data_size": 63488 00:21:13.298 } 00:21:13.298 ] 00:21:13.298 }' 00:21:13.298 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:13.298 13:29:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:13.868 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:13.868 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:14.128 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:21:14.128 13:29:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:21:14.391 [2024-07-25 13:29:54.991787] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:14.391 BaseBdev1 00:21:14.391 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:21:14.391 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:21:14.391 13:29:55 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:14.392 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:14.392 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:14.392 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:14.392 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:14.652 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:21:14.652 [ 00:21:14.652 { 00:21:14.652 "name": "BaseBdev1", 00:21:14.652 "aliases": [ 00:21:14.652 "4af41593-6d31-40de-9a28-3896ecb19b1b" 00:21:14.652 ], 00:21:14.652 "product_name": "Malloc disk", 00:21:14.652 "block_size": 512, 00:21:14.652 "num_blocks": 65536, 00:21:14.652 "uuid": "4af41593-6d31-40de-9a28-3896ecb19b1b", 00:21:14.652 "assigned_rate_limits": { 00:21:14.652 "rw_ios_per_sec": 0, 00:21:14.652 "rw_mbytes_per_sec": 0, 00:21:14.652 "r_mbytes_per_sec": 0, 00:21:14.652 "w_mbytes_per_sec": 0 00:21:14.652 }, 00:21:14.652 "claimed": true, 00:21:14.652 "claim_type": "exclusive_write", 00:21:14.652 "zoned": false, 00:21:14.652 "supported_io_types": { 00:21:14.652 "read": true, 00:21:14.652 "write": true, 00:21:14.652 "unmap": true, 00:21:14.652 "flush": true, 00:21:14.652 "reset": true, 00:21:14.652 "nvme_admin": false, 00:21:14.652 "nvme_io": false, 00:21:14.652 "nvme_io_md": false, 00:21:14.652 "write_zeroes": true, 00:21:14.652 "zcopy": true, 00:21:14.652 "get_zone_info": false, 00:21:14.652 "zone_management": false, 00:21:14.652 "zone_append": false, 00:21:14.652 "compare": false, 00:21:14.652 
"compare_and_write": false, 00:21:14.652 "abort": true, 00:21:14.652 "seek_hole": false, 00:21:14.652 "seek_data": false, 00:21:14.652 "copy": true, 00:21:14.652 "nvme_iov_md": false 00:21:14.652 }, 00:21:14.653 "memory_domains": [ 00:21:14.653 { 00:21:14.653 "dma_device_id": "system", 00:21:14.653 "dma_device_type": 1 00:21:14.653 }, 00:21:14.653 { 00:21:14.653 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:14.653 "dma_device_type": 2 00:21:14.653 } 00:21:14.653 ], 00:21:14.653 "driver_specific": {} 00:21:14.653 } 00:21:14.653 ] 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:14.653 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:14.913 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:14.913 "name": "Existed_Raid", 00:21:14.913 "uuid": "73dd6fed-5072-4934-8ed8-c02eb18e6ed1", 00:21:14.913 "strip_size_kb": 0, 00:21:14.913 "state": "configuring", 00:21:14.913 "raid_level": "raid1", 00:21:14.913 "superblock": true, 00:21:14.913 "num_base_bdevs": 4, 00:21:14.913 "num_base_bdevs_discovered": 3, 00:21:14.913 "num_base_bdevs_operational": 4, 00:21:14.913 "base_bdevs_list": [ 00:21:14.913 { 00:21:14.913 "name": "BaseBdev1", 00:21:14.913 "uuid": "4af41593-6d31-40de-9a28-3896ecb19b1b", 00:21:14.913 "is_configured": true, 00:21:14.913 "data_offset": 2048, 00:21:14.913 "data_size": 63488 00:21:14.913 }, 00:21:14.913 { 00:21:14.913 "name": null, 00:21:14.913 "uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:14.913 "is_configured": false, 00:21:14.913 "data_offset": 2048, 00:21:14.913 "data_size": 63488 00:21:14.913 }, 00:21:14.913 { 00:21:14.913 "name": "BaseBdev3", 00:21:14.913 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:14.913 "is_configured": true, 00:21:14.913 "data_offset": 2048, 00:21:14.913 "data_size": 63488 00:21:14.913 }, 00:21:14.913 { 00:21:14.913 "name": "BaseBdev4", 00:21:14.913 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:14.913 "is_configured": true, 00:21:14.913 "data_offset": 2048, 00:21:14.913 "data_size": 63488 00:21:14.913 } 00:21:14.913 ] 00:21:14.913 }' 00:21:14.913 13:29:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:14.913 13:29:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:15.483 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.483 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:15.743 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:21:15.743 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:21:15.743 [2024-07-25 13:29:56.455492] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:21:15.743 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:15.744 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:16.005 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:16.005 "name": "Existed_Raid", 00:21:16.005 "uuid": "73dd6fed-5072-4934-8ed8-c02eb18e6ed1", 00:21:16.005 "strip_size_kb": 0, 00:21:16.005 "state": "configuring", 00:21:16.005 "raid_level": "raid1", 00:21:16.005 "superblock": true, 00:21:16.005 "num_base_bdevs": 4, 00:21:16.005 "num_base_bdevs_discovered": 2, 00:21:16.005 "num_base_bdevs_operational": 4, 00:21:16.005 "base_bdevs_list": [ 00:21:16.005 { 00:21:16.005 "name": "BaseBdev1", 00:21:16.005 "uuid": "4af41593-6d31-40de-9a28-3896ecb19b1b", 00:21:16.005 "is_configured": true, 00:21:16.005 "data_offset": 2048, 00:21:16.005 "data_size": 63488 00:21:16.005 }, 00:21:16.005 { 00:21:16.005 "name": null, 00:21:16.005 "uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:16.005 "is_configured": false, 00:21:16.005 "data_offset": 2048, 00:21:16.005 "data_size": 63488 00:21:16.005 }, 00:21:16.005 { 00:21:16.005 "name": null, 00:21:16.005 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:16.005 "is_configured": false, 00:21:16.005 "data_offset": 2048, 00:21:16.005 "data_size": 63488 00:21:16.005 }, 00:21:16.005 { 00:21:16.005 "name": "BaseBdev4", 00:21:16.005 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:16.005 "is_configured": true, 00:21:16.005 "data_offset": 2048, 00:21:16.005 "data_size": 63488 00:21:16.005 } 00:21:16.005 ] 00:21:16.005 }' 00:21:16.005 13:29:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:16.005 13:29:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:16.575 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:16.575 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:16.834 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:21:16.834 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:21:16.835 [2024-07-25 13:29:57.590391] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:16.835 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:17.094 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:17.094 "name": "Existed_Raid", 00:21:17.094 "uuid": "73dd6fed-5072-4934-8ed8-c02eb18e6ed1", 00:21:17.094 "strip_size_kb": 0, 00:21:17.094 "state": "configuring", 00:21:17.094 "raid_level": "raid1", 00:21:17.094 "superblock": true, 00:21:17.094 "num_base_bdevs": 4, 00:21:17.094 "num_base_bdevs_discovered": 3, 00:21:17.094 "num_base_bdevs_operational": 4, 00:21:17.094 "base_bdevs_list": [ 00:21:17.094 { 00:21:17.094 "name": "BaseBdev1", 00:21:17.094 "uuid": "4af41593-6d31-40de-9a28-3896ecb19b1b", 00:21:17.094 "is_configured": true, 00:21:17.094 "data_offset": 2048, 00:21:17.094 "data_size": 63488 00:21:17.094 }, 00:21:17.094 { 00:21:17.094 "name": null, 00:21:17.094 "uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:17.094 "is_configured": false, 00:21:17.094 "data_offset": 2048, 00:21:17.094 "data_size": 63488 00:21:17.094 }, 00:21:17.094 { 00:21:17.094 "name": "BaseBdev3", 00:21:17.094 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:17.094 "is_configured": true, 00:21:17.094 "data_offset": 2048, 00:21:17.094 "data_size": 63488 00:21:17.094 }, 00:21:17.094 { 00:21:17.094 "name": "BaseBdev4", 00:21:17.094 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:17.094 "is_configured": true, 00:21:17.094 "data_offset": 2048, 00:21:17.094 "data_size": 63488 00:21:17.094 } 00:21:17.094 ] 00:21:17.094 }' 00:21:17.094 13:29:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:17.094 13:29:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:17.665 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:17.665 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:21:17.925 [2024-07-25 13:29:58.697193] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.925 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:18.184 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.184 13:29:58 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:18.184 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:18.184 "name": "Existed_Raid", 00:21:18.184 "uuid": "73dd6fed-5072-4934-8ed8-c02eb18e6ed1", 00:21:18.184 "strip_size_kb": 0, 00:21:18.184 "state": "configuring", 00:21:18.184 "raid_level": "raid1", 00:21:18.184 "superblock": true, 00:21:18.184 "num_base_bdevs": 4, 00:21:18.184 "num_base_bdevs_discovered": 2, 00:21:18.184 "num_base_bdevs_operational": 4, 00:21:18.184 "base_bdevs_list": [ 00:21:18.184 { 00:21:18.184 "name": null, 00:21:18.184 "uuid": "4af41593-6d31-40de-9a28-3896ecb19b1b", 00:21:18.184 "is_configured": false, 00:21:18.184 "data_offset": 2048, 00:21:18.184 "data_size": 63488 00:21:18.185 }, 00:21:18.185 { 00:21:18.185 "name": null, 00:21:18.185 "uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:18.185 "is_configured": false, 00:21:18.185 "data_offset": 2048, 00:21:18.185 "data_size": 63488 00:21:18.185 }, 00:21:18.185 { 00:21:18.185 "name": "BaseBdev3", 00:21:18.185 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:18.185 "is_configured": true, 00:21:18.185 "data_offset": 2048, 00:21:18.185 "data_size": 63488 00:21:18.185 }, 00:21:18.185 { 00:21:18.185 "name": "BaseBdev4", 00:21:18.185 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:18.185 "is_configured": true, 00:21:18.185 "data_offset": 2048, 00:21:18.185 "data_size": 63488 00:21:18.185 } 00:21:18.185 ] 00:21:18.185 }' 00:21:18.185 13:29:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:18.185 13:29:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:18.754 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:18.754 13:29:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:21:19.013 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:21:19.013 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:21:19.273 [2024-07-25 13:29:59.809847] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:19.273 13:29:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.273 13:29:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:19.273 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:19.273 "name": "Existed_Raid", 00:21:19.273 "uuid": "73dd6fed-5072-4934-8ed8-c02eb18e6ed1", 00:21:19.273 "strip_size_kb": 0, 00:21:19.273 "state": "configuring", 00:21:19.273 "raid_level": "raid1", 00:21:19.273 "superblock": true, 00:21:19.273 "num_base_bdevs": 4, 00:21:19.273 "num_base_bdevs_discovered": 3, 00:21:19.273 "num_base_bdevs_operational": 4, 00:21:19.273 "base_bdevs_list": [ 00:21:19.273 { 00:21:19.273 "name": null, 00:21:19.273 "uuid": "4af41593-6d31-40de-9a28-3896ecb19b1b", 00:21:19.273 "is_configured": false, 00:21:19.273 "data_offset": 2048, 00:21:19.273 "data_size": 63488 00:21:19.273 }, 00:21:19.273 { 00:21:19.273 "name": "BaseBdev2", 00:21:19.274 "uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:19.274 "is_configured": true, 00:21:19.274 "data_offset": 2048, 00:21:19.274 "data_size": 63488 00:21:19.274 }, 00:21:19.274 { 00:21:19.274 "name": "BaseBdev3", 00:21:19.274 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:19.274 "is_configured": true, 00:21:19.274 "data_offset": 2048, 00:21:19.274 "data_size": 63488 00:21:19.274 }, 00:21:19.274 { 00:21:19.274 "name": "BaseBdev4", 00:21:19.274 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:19.274 "is_configured": true, 00:21:19.274 "data_offset": 2048, 00:21:19.274 "data_size": 63488 00:21:19.274 } 00:21:19.274 ] 00:21:19.274 }' 00:21:19.274 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:19.274 13:30:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:19.847 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:19.847 13:30:00 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:21:20.164 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:21:20.164 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:20.164 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:21:20.164 13:30:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 4af41593-6d31-40de-9a28-3896ecb19b1b 00:21:20.446 [2024-07-25 13:30:01.110158] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:21:20.446 [2024-07-25 13:30:01.110280] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xd185e0 00:21:20.446 [2024-07-25 13:30:01.110288] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:20.446 [2024-07-25 13:30:01.110423] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xebe6f0 00:21:20.446 [2024-07-25 13:30:01.110517] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd185e0 00:21:20.446 [2024-07-25 13:30:01.110522] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd185e0 00:21:20.446 [2024-07-25 13:30:01.110603] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:20.446 NewBaseBdev 00:21:20.446 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:21:20.446 13:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_name=NewBaseBdev 00:21:20.446 13:30:01 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:21:20.446 13:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # local i 00:21:20.446 13:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:21:20.446 13:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:21:20.447 13:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:21:20.707 [ 00:21:20.707 { 00:21:20.707 "name": "NewBaseBdev", 00:21:20.707 "aliases": [ 00:21:20.707 "4af41593-6d31-40de-9a28-3896ecb19b1b" 00:21:20.707 ], 00:21:20.707 "product_name": "Malloc disk", 00:21:20.707 "block_size": 512, 00:21:20.707 "num_blocks": 65536, 00:21:20.707 "uuid": "4af41593-6d31-40de-9a28-3896ecb19b1b", 00:21:20.707 "assigned_rate_limits": { 00:21:20.707 "rw_ios_per_sec": 0, 00:21:20.707 "rw_mbytes_per_sec": 0, 00:21:20.707 "r_mbytes_per_sec": 0, 00:21:20.707 "w_mbytes_per_sec": 0 00:21:20.707 }, 00:21:20.707 "claimed": true, 00:21:20.707 "claim_type": "exclusive_write", 00:21:20.707 "zoned": false, 00:21:20.707 "supported_io_types": { 00:21:20.707 "read": true, 00:21:20.707 "write": true, 00:21:20.707 "unmap": true, 00:21:20.707 "flush": true, 00:21:20.707 "reset": true, 00:21:20.707 "nvme_admin": false, 00:21:20.707 "nvme_io": false, 00:21:20.707 "nvme_io_md": false, 00:21:20.707 "write_zeroes": true, 00:21:20.707 "zcopy": true, 00:21:20.707 "get_zone_info": false, 00:21:20.707 "zone_management": false, 00:21:20.707 "zone_append": false, 00:21:20.707 "compare": false, 00:21:20.707 
"compare_and_write": false, 00:21:20.707 "abort": true, 00:21:20.707 "seek_hole": false, 00:21:20.707 "seek_data": false, 00:21:20.707 "copy": true, 00:21:20.707 "nvme_iov_md": false 00:21:20.707 }, 00:21:20.707 "memory_domains": [ 00:21:20.707 { 00:21:20.707 "dma_device_id": "system", 00:21:20.707 "dma_device_type": 1 00:21:20.707 }, 00:21:20.707 { 00:21:20.707 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:20.707 "dma_device_type": 2 00:21:20.707 } 00:21:20.707 ], 00:21:20.707 "driver_specific": {} 00:21:20.707 } 00:21:20.707 ] 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@907 -- # return 0 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:21:20.707 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:21:20.966 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:20.966 "name": "Existed_Raid", 00:21:20.966 "uuid": "73dd6fed-5072-4934-8ed8-c02eb18e6ed1", 00:21:20.966 "strip_size_kb": 0, 00:21:20.966 "state": "online", 00:21:20.966 "raid_level": "raid1", 00:21:20.966 "superblock": true, 00:21:20.966 "num_base_bdevs": 4, 00:21:20.966 "num_base_bdevs_discovered": 4, 00:21:20.966 "num_base_bdevs_operational": 4, 00:21:20.967 "base_bdevs_list": [ 00:21:20.967 { 00:21:20.967 "name": "NewBaseBdev", 00:21:20.967 "uuid": "4af41593-6d31-40de-9a28-3896ecb19b1b", 00:21:20.967 "is_configured": true, 00:21:20.967 "data_offset": 2048, 00:21:20.967 "data_size": 63488 00:21:20.967 }, 00:21:20.967 { 00:21:20.967 "name": "BaseBdev2", 00:21:20.967 "uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:20.967 "is_configured": true, 00:21:20.967 "data_offset": 2048, 00:21:20.967 "data_size": 63488 00:21:20.967 }, 00:21:20.967 { 00:21:20.967 "name": "BaseBdev3", 00:21:20.967 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:20.967 "is_configured": true, 00:21:20.967 "data_offset": 2048, 00:21:20.967 "data_size": 63488 00:21:20.967 }, 00:21:20.967 { 00:21:20.967 "name": "BaseBdev4", 00:21:20.967 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:20.967 "is_configured": true, 00:21:20.967 "data_offset": 2048, 00:21:20.967 "data_size": 63488 00:21:20.967 } 00:21:20.967 ] 00:21:20.967 }' 00:21:20.967 13:30:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:20.967 13:30:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:21.535 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:21:21.535 13:30:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:21:21.535 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:21.535 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:21.535 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:21.536 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:21:21.536 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:21:21.536 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:21.795 [2024-07-25 13:30:02.441809] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:21.795 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:21.795 "name": "Existed_Raid", 00:21:21.795 "aliases": [ 00:21:21.795 "73dd6fed-5072-4934-8ed8-c02eb18e6ed1" 00:21:21.795 ], 00:21:21.795 "product_name": "Raid Volume", 00:21:21.795 "block_size": 512, 00:21:21.795 "num_blocks": 63488, 00:21:21.795 "uuid": "73dd6fed-5072-4934-8ed8-c02eb18e6ed1", 00:21:21.795 "assigned_rate_limits": { 00:21:21.795 "rw_ios_per_sec": 0, 00:21:21.795 "rw_mbytes_per_sec": 0, 00:21:21.795 "r_mbytes_per_sec": 0, 00:21:21.795 "w_mbytes_per_sec": 0 00:21:21.795 }, 00:21:21.795 "claimed": false, 00:21:21.795 "zoned": false, 00:21:21.795 "supported_io_types": { 00:21:21.795 "read": true, 00:21:21.795 "write": true, 00:21:21.795 "unmap": false, 00:21:21.795 "flush": false, 00:21:21.795 "reset": true, 00:21:21.795 "nvme_admin": false, 00:21:21.795 "nvme_io": false, 00:21:21.795 "nvme_io_md": false, 00:21:21.795 "write_zeroes": true, 00:21:21.795 "zcopy": false, 00:21:21.795 
"get_zone_info": false, 00:21:21.795 "zone_management": false, 00:21:21.795 "zone_append": false, 00:21:21.795 "compare": false, 00:21:21.795 "compare_and_write": false, 00:21:21.795 "abort": false, 00:21:21.795 "seek_hole": false, 00:21:21.795 "seek_data": false, 00:21:21.795 "copy": false, 00:21:21.795 "nvme_iov_md": false 00:21:21.795 }, 00:21:21.795 "memory_domains": [ 00:21:21.795 { 00:21:21.795 "dma_device_id": "system", 00:21:21.795 "dma_device_type": 1 00:21:21.795 }, 00:21:21.795 { 00:21:21.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.795 "dma_device_type": 2 00:21:21.795 }, 00:21:21.795 { 00:21:21.795 "dma_device_id": "system", 00:21:21.795 "dma_device_type": 1 00:21:21.795 }, 00:21:21.795 { 00:21:21.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.795 "dma_device_type": 2 00:21:21.795 }, 00:21:21.795 { 00:21:21.795 "dma_device_id": "system", 00:21:21.795 "dma_device_type": 1 00:21:21.795 }, 00:21:21.795 { 00:21:21.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.795 "dma_device_type": 2 00:21:21.795 }, 00:21:21.795 { 00:21:21.795 "dma_device_id": "system", 00:21:21.795 "dma_device_type": 1 00:21:21.795 }, 00:21:21.795 { 00:21:21.795 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:21.795 "dma_device_type": 2 00:21:21.795 } 00:21:21.795 ], 00:21:21.795 "driver_specific": { 00:21:21.795 "raid": { 00:21:21.795 "uuid": "73dd6fed-5072-4934-8ed8-c02eb18e6ed1", 00:21:21.795 "strip_size_kb": 0, 00:21:21.795 "state": "online", 00:21:21.795 "raid_level": "raid1", 00:21:21.795 "superblock": true, 00:21:21.795 "num_base_bdevs": 4, 00:21:21.795 "num_base_bdevs_discovered": 4, 00:21:21.795 "num_base_bdevs_operational": 4, 00:21:21.795 "base_bdevs_list": [ 00:21:21.795 { 00:21:21.795 "name": "NewBaseBdev", 00:21:21.795 "uuid": "4af41593-6d31-40de-9a28-3896ecb19b1b", 00:21:21.795 "is_configured": true, 00:21:21.795 "data_offset": 2048, 00:21:21.795 "data_size": 63488 00:21:21.795 }, 00:21:21.795 { 00:21:21.795 "name": "BaseBdev2", 00:21:21.795 
"uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:21.795 "is_configured": true, 00:21:21.795 "data_offset": 2048, 00:21:21.795 "data_size": 63488 00:21:21.795 }, 00:21:21.795 { 00:21:21.795 "name": "BaseBdev3", 00:21:21.795 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:21.795 "is_configured": true, 00:21:21.795 "data_offset": 2048, 00:21:21.795 "data_size": 63488 00:21:21.795 }, 00:21:21.795 { 00:21:21.795 "name": "BaseBdev4", 00:21:21.795 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:21.795 "is_configured": true, 00:21:21.795 "data_offset": 2048, 00:21:21.795 "data_size": 63488 00:21:21.795 } 00:21:21.795 ] 00:21:21.795 } 00:21:21.795 } 00:21:21.795 }' 00:21:21.795 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:21.795 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:21:21.795 BaseBdev2 00:21:21.795 BaseBdev3 00:21:21.795 BaseBdev4' 00:21:21.795 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:21.795 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:21:21.795 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:22.054 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:22.054 "name": "NewBaseBdev", 00:21:22.054 "aliases": [ 00:21:22.054 "4af41593-6d31-40de-9a28-3896ecb19b1b" 00:21:22.054 ], 00:21:22.054 "product_name": "Malloc disk", 00:21:22.054 "block_size": 512, 00:21:22.054 "num_blocks": 65536, 00:21:22.054 "uuid": "4af41593-6d31-40de-9a28-3896ecb19b1b", 00:21:22.054 "assigned_rate_limits": { 00:21:22.054 "rw_ios_per_sec": 0, 00:21:22.054 "rw_mbytes_per_sec": 0, 
00:21:22.054 "r_mbytes_per_sec": 0, 00:21:22.054 "w_mbytes_per_sec": 0 00:21:22.054 }, 00:21:22.054 "claimed": true, 00:21:22.054 "claim_type": "exclusive_write", 00:21:22.054 "zoned": false, 00:21:22.054 "supported_io_types": { 00:21:22.054 "read": true, 00:21:22.054 "write": true, 00:21:22.054 "unmap": true, 00:21:22.054 "flush": true, 00:21:22.054 "reset": true, 00:21:22.054 "nvme_admin": false, 00:21:22.054 "nvme_io": false, 00:21:22.054 "nvme_io_md": false, 00:21:22.054 "write_zeroes": true, 00:21:22.054 "zcopy": true, 00:21:22.054 "get_zone_info": false, 00:21:22.054 "zone_management": false, 00:21:22.054 "zone_append": false, 00:21:22.054 "compare": false, 00:21:22.054 "compare_and_write": false, 00:21:22.054 "abort": true, 00:21:22.054 "seek_hole": false, 00:21:22.054 "seek_data": false, 00:21:22.054 "copy": true, 00:21:22.054 "nvme_iov_md": false 00:21:22.054 }, 00:21:22.054 "memory_domains": [ 00:21:22.054 { 00:21:22.054 "dma_device_id": "system", 00:21:22.054 "dma_device_type": 1 00:21:22.054 }, 00:21:22.054 { 00:21:22.054 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.054 "dma_device_type": 2 00:21:22.054 } 00:21:22.054 ], 00:21:22.054 "driver_specific": {} 00:21:22.054 }' 00:21:22.054 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.054 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.054 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:22.054 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.054 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.314 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:22.314 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.314 13:30:02 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.314 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:22.314 13:30:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.314 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:22.314 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:22.314 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:22.314 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:21:22.314 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:22.574 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:22.574 "name": "BaseBdev2", 00:21:22.574 "aliases": [ 00:21:22.574 "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc" 00:21:22.574 ], 00:21:22.574 "product_name": "Malloc disk", 00:21:22.574 "block_size": 512, 00:21:22.574 "num_blocks": 65536, 00:21:22.574 "uuid": "d99e1c1d-fe01-4f9c-b3a9-b53aec2a5dfc", 00:21:22.574 "assigned_rate_limits": { 00:21:22.574 "rw_ios_per_sec": 0, 00:21:22.574 "rw_mbytes_per_sec": 0, 00:21:22.574 "r_mbytes_per_sec": 0, 00:21:22.574 "w_mbytes_per_sec": 0 00:21:22.574 }, 00:21:22.574 "claimed": true, 00:21:22.574 "claim_type": "exclusive_write", 00:21:22.574 "zoned": false, 00:21:22.574 "supported_io_types": { 00:21:22.574 "read": true, 00:21:22.574 "write": true, 00:21:22.574 "unmap": true, 00:21:22.574 "flush": true, 00:21:22.574 "reset": true, 00:21:22.574 "nvme_admin": false, 00:21:22.574 "nvme_io": false, 00:21:22.574 "nvme_io_md": false, 00:21:22.574 "write_zeroes": true, 00:21:22.574 "zcopy": true, 00:21:22.574 
"get_zone_info": false, 00:21:22.574 "zone_management": false, 00:21:22.574 "zone_append": false, 00:21:22.574 "compare": false, 00:21:22.574 "compare_and_write": false, 00:21:22.574 "abort": true, 00:21:22.574 "seek_hole": false, 00:21:22.574 "seek_data": false, 00:21:22.574 "copy": true, 00:21:22.574 "nvme_iov_md": false 00:21:22.574 }, 00:21:22.574 "memory_domains": [ 00:21:22.574 { 00:21:22.574 "dma_device_id": "system", 00:21:22.574 "dma_device_type": 1 00:21:22.574 }, 00:21:22.574 { 00:21:22.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:22.574 "dma_device_type": 2 00:21:22.574 } 00:21:22.574 ], 00:21:22.574 "driver_specific": {} 00:21:22.574 }' 00:21:22.574 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.574 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:22.833 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:22.833 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.833 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:22.833 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:22.833 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:22.833 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.092 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:23.092 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.092 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.092 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:23.092 13:30:03 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:23.092 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:21:23.092 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:23.352 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:23.352 "name": "BaseBdev3", 00:21:23.352 "aliases": [ 00:21:23.352 "02bcd612-d3da-47b4-8173-8f61ba439188" 00:21:23.352 ], 00:21:23.352 "product_name": "Malloc disk", 00:21:23.352 "block_size": 512, 00:21:23.352 "num_blocks": 65536, 00:21:23.352 "uuid": "02bcd612-d3da-47b4-8173-8f61ba439188", 00:21:23.352 "assigned_rate_limits": { 00:21:23.352 "rw_ios_per_sec": 0, 00:21:23.352 "rw_mbytes_per_sec": 0, 00:21:23.352 "r_mbytes_per_sec": 0, 00:21:23.352 "w_mbytes_per_sec": 0 00:21:23.352 }, 00:21:23.352 "claimed": true, 00:21:23.352 "claim_type": "exclusive_write", 00:21:23.352 "zoned": false, 00:21:23.352 "supported_io_types": { 00:21:23.352 "read": true, 00:21:23.352 "write": true, 00:21:23.352 "unmap": true, 00:21:23.352 "flush": true, 00:21:23.352 "reset": true, 00:21:23.352 "nvme_admin": false, 00:21:23.352 "nvme_io": false, 00:21:23.352 "nvme_io_md": false, 00:21:23.352 "write_zeroes": true, 00:21:23.352 "zcopy": true, 00:21:23.352 "get_zone_info": false, 00:21:23.352 "zone_management": false, 00:21:23.352 "zone_append": false, 00:21:23.352 "compare": false, 00:21:23.352 "compare_and_write": false, 00:21:23.353 "abort": true, 00:21:23.353 "seek_hole": false, 00:21:23.353 "seek_data": false, 00:21:23.353 "copy": true, 00:21:23.353 "nvme_iov_md": false 00:21:23.353 }, 00:21:23.353 "memory_domains": [ 00:21:23.353 { 00:21:23.353 "dma_device_id": "system", 00:21:23.353 "dma_device_type": 1 00:21:23.353 }, 00:21:23.353 { 00:21:23.353 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:23.353 
"dma_device_type": 2 00:21:23.353 } 00:21:23.353 ], 00:21:23.353 "driver_specific": {} 00:21:23.353 }' 00:21:23.353 13:30:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:23.353 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:23.353 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:23.353 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:23.613 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:23.613 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:23.613 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.613 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:23.932 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:23.932 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.932 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:23.932 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:23.932 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:23.932 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:21:23.932 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:24.192 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:24.192 "name": "BaseBdev4", 00:21:24.192 "aliases": [ 00:21:24.192 
"9e19c743-e19e-470d-90e3-7e1f8b6cac10" 00:21:24.192 ], 00:21:24.192 "product_name": "Malloc disk", 00:21:24.192 "block_size": 512, 00:21:24.192 "num_blocks": 65536, 00:21:24.192 "uuid": "9e19c743-e19e-470d-90e3-7e1f8b6cac10", 00:21:24.192 "assigned_rate_limits": { 00:21:24.192 "rw_ios_per_sec": 0, 00:21:24.192 "rw_mbytes_per_sec": 0, 00:21:24.192 "r_mbytes_per_sec": 0, 00:21:24.192 "w_mbytes_per_sec": 0 00:21:24.192 }, 00:21:24.192 "claimed": true, 00:21:24.192 "claim_type": "exclusive_write", 00:21:24.192 "zoned": false, 00:21:24.192 "supported_io_types": { 00:21:24.192 "read": true, 00:21:24.192 "write": true, 00:21:24.192 "unmap": true, 00:21:24.192 "flush": true, 00:21:24.192 "reset": true, 00:21:24.192 "nvme_admin": false, 00:21:24.192 "nvme_io": false, 00:21:24.192 "nvme_io_md": false, 00:21:24.192 "write_zeroes": true, 00:21:24.192 "zcopy": true, 00:21:24.192 "get_zone_info": false, 00:21:24.192 "zone_management": false, 00:21:24.192 "zone_append": false, 00:21:24.192 "compare": false, 00:21:24.192 "compare_and_write": false, 00:21:24.192 "abort": true, 00:21:24.192 "seek_hole": false, 00:21:24.192 "seek_data": false, 00:21:24.192 "copy": true, 00:21:24.192 "nvme_iov_md": false 00:21:24.192 }, 00:21:24.192 "memory_domains": [ 00:21:24.192 { 00:21:24.192 "dma_device_id": "system", 00:21:24.192 "dma_device_type": 1 00:21:24.192 }, 00:21:24.192 { 00:21:24.192 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:24.192 "dma_device_type": 2 00:21:24.192 } 00:21:24.192 ], 00:21:24.192 "driver_specific": {} 00:21:24.192 }' 00:21:24.192 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.192 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:24.192 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:24.192 13:30:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.452 13:30:04 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:24.452 13:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:24.452 13:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:24.452 13:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:24.452 13:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:24.452 13:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:24.711 13:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:24.711 13:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:24.711 13:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:21:25.281 [2024-07-25 13:30:05.838190] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:21:25.281 [2024-07-25 13:30:05.838210] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:25.281 [2024-07-25 13:30:05.838248] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:25.281 [2024-07-25 13:30:05.838457] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:25.281 [2024-07-25 13:30:05.838463] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd185e0 name Existed_Raid, state offline 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 981525 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@950 -- # '[' -z 981525 ']' 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@954 -- # kill -0 981525 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # uname 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 981525 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 981525' 00:21:25.281 killing process with pid 981525 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@969 -- # kill 981525 00:21:25.281 [2024-07-25 13:30:05.944078] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:25.281 13:30:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@974 -- # wait 981525 00:21:25.281 [2024-07-25 13:30:05.964534] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:25.541 13:30:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:21:25.541 00:21:25.541 real 0m28.572s 00:21:25.541 user 0m53.716s 00:21:25.541 sys 0m4.096s 00:21:25.541 13:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:25.541 13:30:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:25.541 ************************************ 00:21:25.541 END TEST raid_state_function_test_sb 00:21:25.541 ************************************ 00:21:25.541 13:30:06 bdev_raid -- bdev/bdev_raid.sh@949 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:21:25.541 13:30:06 bdev_raid -- 
common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:21:25.541 13:30:06 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:25.541 13:30:06 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:25.541 ************************************ 00:21:25.541 START TEST raid_superblock_test 00:21:25.541 ************************************ 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 4 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=4 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@414 -- # local strip_size 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:21:25.541 13:30:06 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@427 -- # raid_pid=987176 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@428 -- # waitforlisten 987176 /var/tmp/spdk-raid.sock 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@831 -- # '[' -z 987176 ']' 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:25.541 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:25.541 13:30:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:25.541 [2024-07-25 13:30:06.220839] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:21:25.541 [2024-07-25 13:30:06.220889] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid987176 ] 00:21:25.541 [2024-07-25 13:30:06.311199] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:25.800 [2024-07-25 13:30:06.378961] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:25.800 [2024-07-25 13:30:06.418706] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:25.800 [2024-07-25 13:30:06.418729] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@864 -- # return 0 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:26.740 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:21:27.308 malloc1 00:21:27.309 13:30:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:27.878 [2024-07-25 13:30:08.471583] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:27.878 [2024-07-25 13:30:08.471619] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:27.878 [2024-07-25 13:30:08.471630] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe1c9b0 00:21:27.878 [2024-07-25 13:30:08.471636] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:27.878 [2024-07-25 13:30:08.472945] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:27.878 [2024-07-25 13:30:08.472966] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:27.878 pt1 00:21:27.878 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:27.878 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:27.878 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:21:27.878 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:21:27.878 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:21:27.878 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:27.878 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:27.878 13:30:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:27.878 13:30:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:21:28.448 malloc2 00:21:28.448 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:29.019 [2024-07-25 13:30:09.564374] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:29.019 [2024-07-25 13:30:09.564409] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:29.019 [2024-07-25 13:30:09.564419] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe1ddb0 00:21:29.019 [2024-07-25 13:30:09.564427] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:29.019 [2024-07-25 13:30:09.565704] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:29.019 [2024-07-25 13:30:09.565725] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:29.019 pt2 00:21:29.019 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:29.019 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:29.019 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc3 00:21:29.019 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt3 00:21:29.019 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:21:29.019 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:29.019 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:29.019 13:30:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:29.019 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:21:29.019 malloc3 00:21:29.019 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:29.279 [2024-07-25 13:30:09.951271] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:29.279 [2024-07-25 13:30:09.951307] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:29.279 [2024-07-25 13:30:09.951316] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb4780 00:21:29.279 [2024-07-25 13:30:09.951322] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:29.279 [2024-07-25 13:30:09.952532] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:29.279 [2024-07-25 13:30:09.952557] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:29.279 pt3 00:21:29.279 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:29.279 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:29.279 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc4 00:21:29.279 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt4 00:21:29.279 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:21:29.279 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:21:29.279 
13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:21:29.279 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:21:29.279 13:30:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:21:29.539 malloc4 00:21:29.539 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:30.108 [2024-07-25 13:30:10.751039] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:30.108 [2024-07-25 13:30:10.751069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:30.108 [2024-07-25 13:30:10.751080] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb70a0 00:21:30.108 [2024-07-25 13:30:10.751086] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:30.108 [2024-07-25 13:30:10.752324] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:30.108 [2024-07-25 13:30:10.752343] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:30.108 pt4 00:21:30.108 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:21:30.108 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:21:30.108 13:30:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:21:30.367 [2024-07-25 13:30:11.027745] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:21:30.367 [2024-07-25 13:30:11.028750] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:30.367 [2024-07-25 13:30:11.028792] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:30.367 [2024-07-25 13:30:11.028825] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:30.367 [2024-07-25 13:30:11.028946] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xe14970 00:21:30.367 [2024-07-25 13:30:11.028953] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:30.367 [2024-07-25 13:30:11.029109] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe12bb0 00:21:30.367 [2024-07-25 13:30:11.029226] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe14970 00:21:30.367 [2024-07-25 13:30:11.029232] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe14970 00:21:30.367 [2024-07-25 13:30:11.029312] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:30.367 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:30.367 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:30.367 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:30.368 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:30.368 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:30.368 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:30.368 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:30.368 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:21:30.368 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:30.368 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:30.368 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.368 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:30.938 13:30:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:30.938 "name": "raid_bdev1", 00:21:30.938 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:30.938 "strip_size_kb": 0, 00:21:30.938 "state": "online", 00:21:30.938 "raid_level": "raid1", 00:21:30.938 "superblock": true, 00:21:30.938 "num_base_bdevs": 4, 00:21:30.938 "num_base_bdevs_discovered": 4, 00:21:30.938 "num_base_bdevs_operational": 4, 00:21:30.938 "base_bdevs_list": [ 00:21:30.938 { 00:21:30.938 "name": "pt1", 00:21:30.938 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:30.938 "is_configured": true, 00:21:30.938 "data_offset": 2048, 00:21:30.938 "data_size": 63488 00:21:30.938 }, 00:21:30.938 { 00:21:30.938 "name": "pt2", 00:21:30.938 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:30.938 "is_configured": true, 00:21:30.938 "data_offset": 2048, 00:21:30.938 "data_size": 63488 00:21:30.938 }, 00:21:30.938 { 00:21:30.938 "name": "pt3", 00:21:30.938 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:30.938 "is_configured": true, 00:21:30.938 "data_offset": 2048, 00:21:30.938 "data_size": 63488 00:21:30.938 }, 00:21:30.938 { 00:21:30.938 "name": "pt4", 00:21:30.938 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:30.938 "is_configured": true, 00:21:30.938 "data_offset": 2048, 00:21:30.938 "data_size": 63488 00:21:30.938 } 00:21:30.938 ] 00:21:30.938 }' 00:21:30.938 13:30:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:30.938 13:30:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:31.986 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:21:31.986 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:31.986 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:31.986 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:31.986 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:31.986 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:31.986 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:31.986 13:30:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:32.558 [2024-07-25 13:30:13.045084] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:32.558 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:32.558 "name": "raid_bdev1", 00:21:32.558 "aliases": [ 00:21:32.558 "d0a26647-64cf-4faf-bca0-b91eaab5329b" 00:21:32.558 ], 00:21:32.558 "product_name": "Raid Volume", 00:21:32.558 "block_size": 512, 00:21:32.558 "num_blocks": 63488, 00:21:32.558 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:32.558 "assigned_rate_limits": { 00:21:32.558 "rw_ios_per_sec": 0, 00:21:32.558 "rw_mbytes_per_sec": 0, 00:21:32.558 "r_mbytes_per_sec": 0, 00:21:32.558 "w_mbytes_per_sec": 0 00:21:32.558 }, 00:21:32.558 "claimed": false, 00:21:32.558 "zoned": false, 00:21:32.558 "supported_io_types": { 00:21:32.558 "read": true, 00:21:32.558 "write": true, 00:21:32.558 
"unmap": false, 00:21:32.558 "flush": false, 00:21:32.558 "reset": true, 00:21:32.558 "nvme_admin": false, 00:21:32.559 "nvme_io": false, 00:21:32.559 "nvme_io_md": false, 00:21:32.559 "write_zeroes": true, 00:21:32.559 "zcopy": false, 00:21:32.559 "get_zone_info": false, 00:21:32.559 "zone_management": false, 00:21:32.559 "zone_append": false, 00:21:32.559 "compare": false, 00:21:32.559 "compare_and_write": false, 00:21:32.559 "abort": false, 00:21:32.559 "seek_hole": false, 00:21:32.559 "seek_data": false, 00:21:32.559 "copy": false, 00:21:32.559 "nvme_iov_md": false 00:21:32.559 }, 00:21:32.559 "memory_domains": [ 00:21:32.559 { 00:21:32.559 "dma_device_id": "system", 00:21:32.559 "dma_device_type": 1 00:21:32.559 }, 00:21:32.559 { 00:21:32.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.559 "dma_device_type": 2 00:21:32.559 }, 00:21:32.559 { 00:21:32.559 "dma_device_id": "system", 00:21:32.559 "dma_device_type": 1 00:21:32.559 }, 00:21:32.559 { 00:21:32.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.559 "dma_device_type": 2 00:21:32.559 }, 00:21:32.559 { 00:21:32.559 "dma_device_id": "system", 00:21:32.559 "dma_device_type": 1 00:21:32.559 }, 00:21:32.559 { 00:21:32.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.559 "dma_device_type": 2 00:21:32.559 }, 00:21:32.559 { 00:21:32.559 "dma_device_id": "system", 00:21:32.559 "dma_device_type": 1 00:21:32.559 }, 00:21:32.559 { 00:21:32.559 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:32.559 "dma_device_type": 2 00:21:32.559 } 00:21:32.559 ], 00:21:32.559 "driver_specific": { 00:21:32.559 "raid": { 00:21:32.559 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:32.559 "strip_size_kb": 0, 00:21:32.559 "state": "online", 00:21:32.559 "raid_level": "raid1", 00:21:32.559 "superblock": true, 00:21:32.559 "num_base_bdevs": 4, 00:21:32.559 "num_base_bdevs_discovered": 4, 00:21:32.559 "num_base_bdevs_operational": 4, 00:21:32.559 "base_bdevs_list": [ 00:21:32.559 { 00:21:32.559 "name": "pt1", 
00:21:32.559 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:32.559 "is_configured": true, 00:21:32.559 "data_offset": 2048, 00:21:32.559 "data_size": 63488 00:21:32.559 }, 00:21:32.559 { 00:21:32.559 "name": "pt2", 00:21:32.559 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:32.559 "is_configured": true, 00:21:32.559 "data_offset": 2048, 00:21:32.559 "data_size": 63488 00:21:32.559 }, 00:21:32.559 { 00:21:32.559 "name": "pt3", 00:21:32.559 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:32.559 "is_configured": true, 00:21:32.559 "data_offset": 2048, 00:21:32.559 "data_size": 63488 00:21:32.559 }, 00:21:32.559 { 00:21:32.559 "name": "pt4", 00:21:32.559 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:32.559 "is_configured": true, 00:21:32.559 "data_offset": 2048, 00:21:32.559 "data_size": 63488 00:21:32.559 } 00:21:32.559 ] 00:21:32.559 } 00:21:32.559 } 00:21:32.559 }' 00:21:32.559 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:32.559 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:32.559 pt2 00:21:32.559 pt3 00:21:32.559 pt4' 00:21:32.559 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:32.559 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:32.559 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:33.129 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:33.129 "name": "pt1", 00:21:33.129 "aliases": [ 00:21:33.129 "00000000-0000-0000-0000-000000000001" 00:21:33.129 ], 00:21:33.129 "product_name": "passthru", 00:21:33.129 "block_size": 512, 00:21:33.129 "num_blocks": 65536, 00:21:33.129 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:21:33.129 "assigned_rate_limits": { 00:21:33.129 "rw_ios_per_sec": 0, 00:21:33.129 "rw_mbytes_per_sec": 0, 00:21:33.129 "r_mbytes_per_sec": 0, 00:21:33.129 "w_mbytes_per_sec": 0 00:21:33.129 }, 00:21:33.129 "claimed": true, 00:21:33.129 "claim_type": "exclusive_write", 00:21:33.129 "zoned": false, 00:21:33.129 "supported_io_types": { 00:21:33.129 "read": true, 00:21:33.129 "write": true, 00:21:33.129 "unmap": true, 00:21:33.129 "flush": true, 00:21:33.129 "reset": true, 00:21:33.129 "nvme_admin": false, 00:21:33.129 "nvme_io": false, 00:21:33.129 "nvme_io_md": false, 00:21:33.129 "write_zeroes": true, 00:21:33.129 "zcopy": true, 00:21:33.130 "get_zone_info": false, 00:21:33.130 "zone_management": false, 00:21:33.130 "zone_append": false, 00:21:33.130 "compare": false, 00:21:33.130 "compare_and_write": false, 00:21:33.130 "abort": true, 00:21:33.130 "seek_hole": false, 00:21:33.130 "seek_data": false, 00:21:33.130 "copy": true, 00:21:33.130 "nvme_iov_md": false 00:21:33.130 }, 00:21:33.130 "memory_domains": [ 00:21:33.130 { 00:21:33.130 "dma_device_id": "system", 00:21:33.130 "dma_device_type": 1 00:21:33.130 }, 00:21:33.130 { 00:21:33.130 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:33.130 "dma_device_type": 2 00:21:33.130 } 00:21:33.130 ], 00:21:33.130 "driver_specific": { 00:21:33.130 "passthru": { 00:21:33.130 "name": "pt1", 00:21:33.130 "base_bdev_name": "malloc1" 00:21:33.130 } 00:21:33.130 } 00:21:33.130 }' 00:21:33.130 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.130 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:33.130 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:33.130 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.391 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:33.391 13:30:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:33.391 13:30:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.391 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:33.391 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:33.391 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.651 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:33.651 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:33.651 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:33.651 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:33.651 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:34.221 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:34.221 "name": "pt2", 00:21:34.221 "aliases": [ 00:21:34.221 "00000000-0000-0000-0000-000000000002" 00:21:34.221 ], 00:21:34.221 "product_name": "passthru", 00:21:34.221 "block_size": 512, 00:21:34.221 "num_blocks": 65536, 00:21:34.221 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:34.221 "assigned_rate_limits": { 00:21:34.221 "rw_ios_per_sec": 0, 00:21:34.221 "rw_mbytes_per_sec": 0, 00:21:34.221 "r_mbytes_per_sec": 0, 00:21:34.221 "w_mbytes_per_sec": 0 00:21:34.221 }, 00:21:34.221 "claimed": true, 00:21:34.221 "claim_type": "exclusive_write", 00:21:34.221 "zoned": false, 00:21:34.221 "supported_io_types": { 00:21:34.221 "read": true, 00:21:34.221 "write": true, 00:21:34.221 "unmap": true, 00:21:34.221 "flush": true, 00:21:34.221 "reset": true, 00:21:34.221 "nvme_admin": false, 00:21:34.221 
"nvme_io": false, 00:21:34.221 "nvme_io_md": false, 00:21:34.221 "write_zeroes": true, 00:21:34.221 "zcopy": true, 00:21:34.221 "get_zone_info": false, 00:21:34.221 "zone_management": false, 00:21:34.221 "zone_append": false, 00:21:34.221 "compare": false, 00:21:34.221 "compare_and_write": false, 00:21:34.221 "abort": true, 00:21:34.221 "seek_hole": false, 00:21:34.221 "seek_data": false, 00:21:34.221 "copy": true, 00:21:34.221 "nvme_iov_md": false 00:21:34.221 }, 00:21:34.221 "memory_domains": [ 00:21:34.221 { 00:21:34.221 "dma_device_id": "system", 00:21:34.221 "dma_device_type": 1 00:21:34.221 }, 00:21:34.221 { 00:21:34.221 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:34.221 "dma_device_type": 2 00:21:34.221 } 00:21:34.221 ], 00:21:34.221 "driver_specific": { 00:21:34.221 "passthru": { 00:21:34.221 "name": "pt2", 00:21:34.221 "base_bdev_name": "malloc2" 00:21:34.221 } 00:21:34.221 } 00:21:34.221 }' 00:21:34.221 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.221 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:34.221 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:34.221 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.221 13:30:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:34.481 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:34.481 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.481 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:34.481 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:34.481 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:34.481 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:21:34.741 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:34.741 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:34.741 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:34.741 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:35.310 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:35.310 "name": "pt3", 00:21:35.310 "aliases": [ 00:21:35.310 "00000000-0000-0000-0000-000000000003" 00:21:35.310 ], 00:21:35.310 "product_name": "passthru", 00:21:35.310 "block_size": 512, 00:21:35.310 "num_blocks": 65536, 00:21:35.310 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:35.310 "assigned_rate_limits": { 00:21:35.310 "rw_ios_per_sec": 0, 00:21:35.310 "rw_mbytes_per_sec": 0, 00:21:35.310 "r_mbytes_per_sec": 0, 00:21:35.310 "w_mbytes_per_sec": 0 00:21:35.310 }, 00:21:35.310 "claimed": true, 00:21:35.310 "claim_type": "exclusive_write", 00:21:35.310 "zoned": false, 00:21:35.310 "supported_io_types": { 00:21:35.310 "read": true, 00:21:35.310 "write": true, 00:21:35.310 "unmap": true, 00:21:35.310 "flush": true, 00:21:35.310 "reset": true, 00:21:35.310 "nvme_admin": false, 00:21:35.310 "nvme_io": false, 00:21:35.310 "nvme_io_md": false, 00:21:35.310 "write_zeroes": true, 00:21:35.310 "zcopy": true, 00:21:35.310 "get_zone_info": false, 00:21:35.310 "zone_management": false, 00:21:35.310 "zone_append": false, 00:21:35.310 "compare": false, 00:21:35.310 "compare_and_write": false, 00:21:35.310 "abort": true, 00:21:35.310 "seek_hole": false, 00:21:35.310 "seek_data": false, 00:21:35.310 "copy": true, 00:21:35.310 "nvme_iov_md": false 00:21:35.310 }, 00:21:35.311 "memory_domains": [ 00:21:35.311 { 00:21:35.311 "dma_device_id": "system", 00:21:35.311 
"dma_device_type": 1 00:21:35.311 }, 00:21:35.311 { 00:21:35.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.311 "dma_device_type": 2 00:21:35.311 } 00:21:35.311 ], 00:21:35.311 "driver_specific": { 00:21:35.311 "passthru": { 00:21:35.311 "name": "pt3", 00:21:35.311 "base_bdev_name": "malloc3" 00:21:35.311 } 00:21:35.311 } 00:21:35.311 }' 00:21:35.311 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.311 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.311 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:35.311 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.311 13:30:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.311 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:35.311 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.311 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.571 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:35.571 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:35.571 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:35.571 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:35.571 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:35.571 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:35.571 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:35.831 13:30:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:35.831 "name": "pt4", 00:21:35.831 "aliases": [ 00:21:35.831 "00000000-0000-0000-0000-000000000004" 00:21:35.831 ], 00:21:35.831 "product_name": "passthru", 00:21:35.831 "block_size": 512, 00:21:35.831 "num_blocks": 65536, 00:21:35.831 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:35.831 "assigned_rate_limits": { 00:21:35.831 "rw_ios_per_sec": 0, 00:21:35.831 "rw_mbytes_per_sec": 0, 00:21:35.831 "r_mbytes_per_sec": 0, 00:21:35.831 "w_mbytes_per_sec": 0 00:21:35.831 }, 00:21:35.831 "claimed": true, 00:21:35.831 "claim_type": "exclusive_write", 00:21:35.831 "zoned": false, 00:21:35.831 "supported_io_types": { 00:21:35.831 "read": true, 00:21:35.831 "write": true, 00:21:35.831 "unmap": true, 00:21:35.831 "flush": true, 00:21:35.831 "reset": true, 00:21:35.831 "nvme_admin": false, 00:21:35.831 "nvme_io": false, 00:21:35.831 "nvme_io_md": false, 00:21:35.831 "write_zeroes": true, 00:21:35.831 "zcopy": true, 00:21:35.831 "get_zone_info": false, 00:21:35.831 "zone_management": false, 00:21:35.831 "zone_append": false, 00:21:35.831 "compare": false, 00:21:35.831 "compare_and_write": false, 00:21:35.831 "abort": true, 00:21:35.831 "seek_hole": false, 00:21:35.831 "seek_data": false, 00:21:35.831 "copy": true, 00:21:35.831 "nvme_iov_md": false 00:21:35.831 }, 00:21:35.831 "memory_domains": [ 00:21:35.831 { 00:21:35.831 "dma_device_id": "system", 00:21:35.831 "dma_device_type": 1 00:21:35.831 }, 00:21:35.831 { 00:21:35.831 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:35.831 "dma_device_type": 2 00:21:35.831 } 00:21:35.831 ], 00:21:35.831 "driver_specific": { 00:21:35.831 "passthru": { 00:21:35.831 "name": "pt4", 00:21:35.831 "base_bdev_name": "malloc4" 00:21:35.831 } 00:21:35.831 } 00:21:35.831 }' 00:21:35.831 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.831 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:35.831 13:30:16 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:35.831 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.831 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:35.831 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:35.831 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:35.831 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:36.092 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:36.092 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.092 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:36.092 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:36.092 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:21:36.092 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:36.351 [2024-07-25 13:30:16.918870] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:36.351 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=d0a26647-64cf-4faf-bca0-b91eaab5329b 00:21:36.351 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@451 -- # '[' -z d0a26647-64cf-4faf-bca0-b91eaab5329b ']' 00:21:36.351 13:30:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:36.351 [2024-07-25 13:30:17.111109] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:36.352 
[2024-07-25 13:30:17.111120] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:36.352 [2024-07-25 13:30:17.111156] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:36.352 [2024-07-25 13:30:17.111221] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:36.352 [2024-07-25 13:30:17.111228] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe14970 name raid_bdev1, state offline 00:21:36.352 13:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.352 13:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:21:36.612 13:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:21:36.612 13:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:21:36.612 13:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:36.612 13:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:37.182 13:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:37.182 13:30:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:37.752 13:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:37.752 13:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:38.322 13:30:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:21:38.322 13:30:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:38.892 13:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:21:38.892 13:30:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:21:39.463 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:21:39.463 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:39.463 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@650 -- # local es=0 00:21:39.463 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:39.463 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:21:39.464 [2024-07-25 13:30:20.219106] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:21:39.464 [2024-07-25 13:30:20.220218] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:21:39.464 [2024-07-25 13:30:20.220254] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:21:39.464 [2024-07-25 13:30:20.220278] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:21:39.464 [2024-07-25 13:30:20.220311] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:21:39.464 [2024-07-25 13:30:20.220341] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:21:39.464 [2024-07-25 13:30:20.220355] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:21:39.464 [2024-07-25 13:30:20.220368] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:21:39.464 [2024-07-25 
13:30:20.220378] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:39.464 [2024-07-25 13:30:20.220385] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe1ce50 name raid_bdev1, state configuring 00:21:39.464 request: 00:21:39.464 { 00:21:39.464 "name": "raid_bdev1", 00:21:39.464 "raid_level": "raid1", 00:21:39.464 "base_bdevs": [ 00:21:39.464 "malloc1", 00:21:39.464 "malloc2", 00:21:39.464 "malloc3", 00:21:39.464 "malloc4" 00:21:39.464 ], 00:21:39.464 "superblock": false, 00:21:39.464 "method": "bdev_raid_create", 00:21:39.464 "req_id": 1 00:21:39.464 } 00:21:39.464 Got JSON-RPC error response 00:21:39.464 response: 00:21:39.464 { 00:21:39.464 "code": -17, 00:21:39.464 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:21:39.464 } 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@653 -- # es=1 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.464 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:21:39.724 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:21:39.724 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:21:39.724 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:39.984 [2024-07-25 13:30:20.604034] 
vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:39.984 [2024-07-25 13:30:20.604069] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:39.984 [2024-07-25 13:30:20.604082] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe1cbe0 00:21:39.984 [2024-07-25 13:30:20.604088] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:39.984 [2024-07-25 13:30:20.605365] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:39.984 [2024-07-25 13:30:20.605386] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:39.984 [2024-07-25 13:30:20.605435] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:39.984 [2024-07-25 13:30:20.605454] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:39.984 pt1 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.984 13:30:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.984 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:40.244 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:40.244 "name": "raid_bdev1", 00:21:40.244 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:40.244 "strip_size_kb": 0, 00:21:40.244 "state": "configuring", 00:21:40.244 "raid_level": "raid1", 00:21:40.244 "superblock": true, 00:21:40.244 "num_base_bdevs": 4, 00:21:40.244 "num_base_bdevs_discovered": 1, 00:21:40.244 "num_base_bdevs_operational": 4, 00:21:40.244 "base_bdevs_list": [ 00:21:40.244 { 00:21:40.244 "name": "pt1", 00:21:40.244 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:40.244 "is_configured": true, 00:21:40.244 "data_offset": 2048, 00:21:40.244 "data_size": 63488 00:21:40.244 }, 00:21:40.244 { 00:21:40.244 "name": null, 00:21:40.244 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:40.244 "is_configured": false, 00:21:40.244 "data_offset": 2048, 00:21:40.244 "data_size": 63488 00:21:40.244 }, 00:21:40.244 { 00:21:40.244 "name": null, 00:21:40.244 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:40.244 "is_configured": false, 00:21:40.244 "data_offset": 2048, 00:21:40.244 "data_size": 63488 00:21:40.244 }, 00:21:40.244 { 00:21:40.244 "name": null, 00:21:40.244 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:40.244 "is_configured": false, 00:21:40.244 "data_offset": 2048, 00:21:40.244 "data_size": 63488 00:21:40.244 } 00:21:40.244 ] 00:21:40.244 }' 00:21:40.244 13:30:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:40.244 13:30:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:21:40.814 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@485 -- # '[' 4 -gt 2 ']' 00:21:40.814 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@487 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:40.814 [2024-07-25 13:30:21.522352] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:40.814 [2024-07-25 13:30:21.522387] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:40.814 [2024-07-25 13:30:21.522399] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfc0430 00:21:40.814 [2024-07-25 13:30:21.522406] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:40.814 [2024-07-25 13:30:21.522685] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:40.814 [2024-07-25 13:30:21.522698] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:40.814 [2024-07-25 13:30:21.522742] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:40.814 [2024-07-25 13:30:21.522755] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:40.814 pt2 00:21:40.814 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@488 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:41.074 [2024-07-25 13:30:21.718854] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@489 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.074 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.335 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:41.335 "name": "raid_bdev1", 00:21:41.335 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:41.335 "strip_size_kb": 0, 00:21:41.335 "state": "configuring", 00:21:41.335 "raid_level": "raid1", 00:21:41.335 "superblock": true, 00:21:41.335 "num_base_bdevs": 4, 00:21:41.335 "num_base_bdevs_discovered": 1, 00:21:41.335 "num_base_bdevs_operational": 4, 00:21:41.335 "base_bdevs_list": [ 00:21:41.335 { 00:21:41.335 "name": "pt1", 00:21:41.335 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:41.335 "is_configured": true, 00:21:41.335 "data_offset": 2048, 00:21:41.335 "data_size": 63488 00:21:41.335 }, 00:21:41.335 { 00:21:41.335 "name": null, 00:21:41.335 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:41.335 "is_configured": false, 00:21:41.335 "data_offset": 2048, 00:21:41.335 
"data_size": 63488 00:21:41.335 }, 00:21:41.335 { 00:21:41.335 "name": null, 00:21:41.335 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:41.335 "is_configured": false, 00:21:41.335 "data_offset": 2048, 00:21:41.335 "data_size": 63488 00:21:41.335 }, 00:21:41.335 { 00:21:41.335 "name": null, 00:21:41.335 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:41.335 "is_configured": false, 00:21:41.335 "data_offset": 2048, 00:21:41.335 "data_size": 63488 00:21:41.335 } 00:21:41.335 ] 00:21:41.335 }' 00:21:41.335 13:30:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:41.335 13:30:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:41.906 13:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:21:41.906 13:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:41.906 13:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:42.166 [2024-07-25 13:30:22.885832] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:42.166 [2024-07-25 13:30:22.885868] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:42.166 [2024-07-25 13:30:22.885882] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb72d0 00:21:42.166 [2024-07-25 13:30:22.885888] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:42.166 [2024-07-25 13:30:22.886162] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:42.166 [2024-07-25 13:30:22.886173] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:42.166 [2024-07-25 13:30:22.886219] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev 
pt2 00:21:42.166 [2024-07-25 13:30:22.886233] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:42.166 pt2 00:21:42.166 13:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:42.166 13:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:42.166 13:30:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:42.768 [2024-07-25 13:30:23.415177] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:42.768 [2024-07-25 13:30:23.415208] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:42.768 [2024-07-25 13:30:23.415219] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe14500 00:21:42.768 [2024-07-25 13:30:23.415225] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:42.768 [2024-07-25 13:30:23.415480] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:42.768 [2024-07-25 13:30:23.415492] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:42.768 [2024-07-25 13:30:23.415532] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:42.768 [2024-07-25 13:30:23.415544] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:42.768 pt3 00:21:42.768 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:42.768 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:42.768 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 
00000000-0000-0000-0000-000000000004 00:21:43.033 [2024-07-25 13:30:23.691880] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:43.033 [2024-07-25 13:30:23.691906] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:43.033 [2024-07-25 13:30:23.691917] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe17120 00:21:43.033 [2024-07-25 13:30:23.691924] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:43.033 [2024-07-25 13:30:23.692157] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:43.033 [2024-07-25 13:30:23.692168] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:43.033 [2024-07-25 13:30:23.692205] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:43.033 [2024-07-25 13:30:23.692216] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:43.033 [2024-07-25 13:30:23.692312] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xe17ce0 00:21:43.033 [2024-07-25 13:30:23.692318] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:43.033 [2024-07-25 13:30:23.692450] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfb5fc0 00:21:43.033 [2024-07-25 13:30:23.692567] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe17ce0 00:21:43.033 [2024-07-25 13:30:23.692582] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe17ce0 00:21:43.033 [2024-07-25 13:30:23.692674] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:43.033 pt4 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:21:43.033 13:30:23 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.033 13:30:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:43.604 13:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:43.604 "name": "raid_bdev1", 00:21:43.604 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:43.604 "strip_size_kb": 0, 00:21:43.604 "state": "online", 00:21:43.604 "raid_level": "raid1", 00:21:43.604 "superblock": true, 00:21:43.604 "num_base_bdevs": 4, 00:21:43.604 "num_base_bdevs_discovered": 4, 00:21:43.604 "num_base_bdevs_operational": 4, 00:21:43.604 "base_bdevs_list": [ 00:21:43.604 { 00:21:43.604 "name": "pt1", 00:21:43.604 "uuid": "00000000-0000-0000-0000-000000000001", 
00:21:43.604 "is_configured": true, 00:21:43.604 "data_offset": 2048, 00:21:43.604 "data_size": 63488 00:21:43.604 }, 00:21:43.604 { 00:21:43.604 "name": "pt2", 00:21:43.604 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:43.604 "is_configured": true, 00:21:43.604 "data_offset": 2048, 00:21:43.604 "data_size": 63488 00:21:43.604 }, 00:21:43.604 { 00:21:43.604 "name": "pt3", 00:21:43.604 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:43.604 "is_configured": true, 00:21:43.604 "data_offset": 2048, 00:21:43.604 "data_size": 63488 00:21:43.604 }, 00:21:43.604 { 00:21:43.604 "name": "pt4", 00:21:43.604 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:43.604 "is_configured": true, 00:21:43.604 "data_offset": 2048, 00:21:43.604 "data_size": 63488 00:21:43.604 } 00:21:43.604 ] 00:21:43.604 }' 00:21:43.604 13:30:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:43.604 13:30:24 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:44.545 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:21:44.545 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:21:44.545 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:21:44.545 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:21:44.545 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:21:44.545 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:21:44.545 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:44.545 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:21:44.804 [2024-07-25 13:30:25.400471] 
bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:44.805 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:21:44.805 "name": "raid_bdev1", 00:21:44.805 "aliases": [ 00:21:44.805 "d0a26647-64cf-4faf-bca0-b91eaab5329b" 00:21:44.805 ], 00:21:44.805 "product_name": "Raid Volume", 00:21:44.805 "block_size": 512, 00:21:44.805 "num_blocks": 63488, 00:21:44.805 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:44.805 "assigned_rate_limits": { 00:21:44.805 "rw_ios_per_sec": 0, 00:21:44.805 "rw_mbytes_per_sec": 0, 00:21:44.805 "r_mbytes_per_sec": 0, 00:21:44.805 "w_mbytes_per_sec": 0 00:21:44.805 }, 00:21:44.805 "claimed": false, 00:21:44.805 "zoned": false, 00:21:44.805 "supported_io_types": { 00:21:44.805 "read": true, 00:21:44.805 "write": true, 00:21:44.805 "unmap": false, 00:21:44.805 "flush": false, 00:21:44.805 "reset": true, 00:21:44.805 "nvme_admin": false, 00:21:44.805 "nvme_io": false, 00:21:44.805 "nvme_io_md": false, 00:21:44.805 "write_zeroes": true, 00:21:44.805 "zcopy": false, 00:21:44.805 "get_zone_info": false, 00:21:44.805 "zone_management": false, 00:21:44.805 "zone_append": false, 00:21:44.805 "compare": false, 00:21:44.805 "compare_and_write": false, 00:21:44.805 "abort": false, 00:21:44.805 "seek_hole": false, 00:21:44.805 "seek_data": false, 00:21:44.805 "copy": false, 00:21:44.805 "nvme_iov_md": false 00:21:44.805 }, 00:21:44.805 "memory_domains": [ 00:21:44.805 { 00:21:44.805 "dma_device_id": "system", 00:21:44.805 "dma_device_type": 1 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.805 "dma_device_type": 2 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "dma_device_id": "system", 00:21:44.805 "dma_device_type": 1 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.805 "dma_device_type": 2 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "dma_device_id": "system", 00:21:44.805 
"dma_device_type": 1 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.805 "dma_device_type": 2 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "dma_device_id": "system", 00:21:44.805 "dma_device_type": 1 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:44.805 "dma_device_type": 2 00:21:44.805 } 00:21:44.805 ], 00:21:44.805 "driver_specific": { 00:21:44.805 "raid": { 00:21:44.805 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:44.805 "strip_size_kb": 0, 00:21:44.805 "state": "online", 00:21:44.805 "raid_level": "raid1", 00:21:44.805 "superblock": true, 00:21:44.805 "num_base_bdevs": 4, 00:21:44.805 "num_base_bdevs_discovered": 4, 00:21:44.805 "num_base_bdevs_operational": 4, 00:21:44.805 "base_bdevs_list": [ 00:21:44.805 { 00:21:44.805 "name": "pt1", 00:21:44.805 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:44.805 "is_configured": true, 00:21:44.805 "data_offset": 2048, 00:21:44.805 "data_size": 63488 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "name": "pt2", 00:21:44.805 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:44.805 "is_configured": true, 00:21:44.805 "data_offset": 2048, 00:21:44.805 "data_size": 63488 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "name": "pt3", 00:21:44.805 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:44.805 "is_configured": true, 00:21:44.805 "data_offset": 2048, 00:21:44.805 "data_size": 63488 00:21:44.805 }, 00:21:44.805 { 00:21:44.805 "name": "pt4", 00:21:44.805 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:44.805 "is_configured": true, 00:21:44.805 "data_offset": 2048, 00:21:44.805 "data_size": 63488 00:21:44.805 } 00:21:44.805 ] 00:21:44.805 } 00:21:44.805 } 00:21:44.805 }' 00:21:44.805 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:21:44.805 13:30:25 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:21:44.805 pt2 00:21:44.805 pt3 00:21:44.805 pt4' 00:21:44.805 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:44.805 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:21:44.805 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:45.066 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:45.066 "name": "pt1", 00:21:45.066 "aliases": [ 00:21:45.066 "00000000-0000-0000-0000-000000000001" 00:21:45.066 ], 00:21:45.066 "product_name": "passthru", 00:21:45.066 "block_size": 512, 00:21:45.066 "num_blocks": 65536, 00:21:45.066 "uuid": "00000000-0000-0000-0000-000000000001", 00:21:45.066 "assigned_rate_limits": { 00:21:45.066 "rw_ios_per_sec": 0, 00:21:45.066 "rw_mbytes_per_sec": 0, 00:21:45.066 "r_mbytes_per_sec": 0, 00:21:45.066 "w_mbytes_per_sec": 0 00:21:45.066 }, 00:21:45.066 "claimed": true, 00:21:45.066 "claim_type": "exclusive_write", 00:21:45.066 "zoned": false, 00:21:45.066 "supported_io_types": { 00:21:45.066 "read": true, 00:21:45.066 "write": true, 00:21:45.066 "unmap": true, 00:21:45.066 "flush": true, 00:21:45.066 "reset": true, 00:21:45.066 "nvme_admin": false, 00:21:45.066 "nvme_io": false, 00:21:45.066 "nvme_io_md": false, 00:21:45.066 "write_zeroes": true, 00:21:45.066 "zcopy": true, 00:21:45.066 "get_zone_info": false, 00:21:45.066 "zone_management": false, 00:21:45.066 "zone_append": false, 00:21:45.066 "compare": false, 00:21:45.066 "compare_and_write": false, 00:21:45.066 "abort": true, 00:21:45.066 "seek_hole": false, 00:21:45.066 "seek_data": false, 00:21:45.066 "copy": true, 00:21:45.066 "nvme_iov_md": false 00:21:45.066 }, 00:21:45.066 "memory_domains": [ 00:21:45.066 { 00:21:45.066 "dma_device_id": "system", 00:21:45.066 
"dma_device_type": 1 00:21:45.066 }, 00:21:45.066 { 00:21:45.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.066 "dma_device_type": 2 00:21:45.066 } 00:21:45.066 ], 00:21:45.066 "driver_specific": { 00:21:45.066 "passthru": { 00:21:45.066 "name": "pt1", 00:21:45.066 "base_bdev_name": "malloc1" 00:21:45.066 } 00:21:45.066 } 00:21:45.066 }' 00:21:45.066 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.066 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.066 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:45.066 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.066 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.066 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:45.066 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.327 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.327 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:45.327 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.327 13:30:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.327 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:45.327 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:45.327 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:21:45.327 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:45.587 13:30:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:45.587 "name": "pt2", 00:21:45.587 "aliases": [ 00:21:45.587 "00000000-0000-0000-0000-000000000002" 00:21:45.587 ], 00:21:45.587 "product_name": "passthru", 00:21:45.587 "block_size": 512, 00:21:45.587 "num_blocks": 65536, 00:21:45.587 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:45.587 "assigned_rate_limits": { 00:21:45.587 "rw_ios_per_sec": 0, 00:21:45.587 "rw_mbytes_per_sec": 0, 00:21:45.587 "r_mbytes_per_sec": 0, 00:21:45.587 "w_mbytes_per_sec": 0 00:21:45.587 }, 00:21:45.587 "claimed": true, 00:21:45.587 "claim_type": "exclusive_write", 00:21:45.587 "zoned": false, 00:21:45.587 "supported_io_types": { 00:21:45.587 "read": true, 00:21:45.587 "write": true, 00:21:45.587 "unmap": true, 00:21:45.587 "flush": true, 00:21:45.587 "reset": true, 00:21:45.587 "nvme_admin": false, 00:21:45.587 "nvme_io": false, 00:21:45.587 "nvme_io_md": false, 00:21:45.587 "write_zeroes": true, 00:21:45.587 "zcopy": true, 00:21:45.587 "get_zone_info": false, 00:21:45.587 "zone_management": false, 00:21:45.587 "zone_append": false, 00:21:45.587 "compare": false, 00:21:45.587 "compare_and_write": false, 00:21:45.587 "abort": true, 00:21:45.587 "seek_hole": false, 00:21:45.587 "seek_data": false, 00:21:45.587 "copy": true, 00:21:45.587 "nvme_iov_md": false 00:21:45.587 }, 00:21:45.587 "memory_domains": [ 00:21:45.587 { 00:21:45.587 "dma_device_id": "system", 00:21:45.587 "dma_device_type": 1 00:21:45.587 }, 00:21:45.587 { 00:21:45.587 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:45.587 "dma_device_type": 2 00:21:45.587 } 00:21:45.587 ], 00:21:45.587 "driver_specific": { 00:21:45.587 "passthru": { 00:21:45.587 "name": "pt2", 00:21:45.587 "base_bdev_name": "malloc2" 00:21:45.587 } 00:21:45.587 } 00:21:45.587 }' 00:21:45.587 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.587 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:45.587 13:30:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:45.587 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.587 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:45.847 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:45.847 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.847 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:45.847 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:45.847 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.847 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:45.847 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:45.847 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:45.847 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:21:45.847 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:46.107 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:46.107 "name": "pt3", 00:21:46.107 "aliases": [ 00:21:46.107 "00000000-0000-0000-0000-000000000003" 00:21:46.107 ], 00:21:46.107 "product_name": "passthru", 00:21:46.107 "block_size": 512, 00:21:46.107 "num_blocks": 65536, 00:21:46.107 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:46.107 "assigned_rate_limits": { 00:21:46.107 "rw_ios_per_sec": 0, 00:21:46.107 "rw_mbytes_per_sec": 0, 00:21:46.107 "r_mbytes_per_sec": 0, 00:21:46.107 "w_mbytes_per_sec": 0 00:21:46.107 }, 00:21:46.107 "claimed": true, 00:21:46.107 
"claim_type": "exclusive_write", 00:21:46.107 "zoned": false, 00:21:46.107 "supported_io_types": { 00:21:46.107 "read": true, 00:21:46.107 "write": true, 00:21:46.107 "unmap": true, 00:21:46.107 "flush": true, 00:21:46.107 "reset": true, 00:21:46.107 "nvme_admin": false, 00:21:46.107 "nvme_io": false, 00:21:46.107 "nvme_io_md": false, 00:21:46.107 "write_zeroes": true, 00:21:46.107 "zcopy": true, 00:21:46.107 "get_zone_info": false, 00:21:46.107 "zone_management": false, 00:21:46.107 "zone_append": false, 00:21:46.107 "compare": false, 00:21:46.107 "compare_and_write": false, 00:21:46.107 "abort": true, 00:21:46.107 "seek_hole": false, 00:21:46.107 "seek_data": false, 00:21:46.107 "copy": true, 00:21:46.107 "nvme_iov_md": false 00:21:46.107 }, 00:21:46.107 "memory_domains": [ 00:21:46.107 { 00:21:46.107 "dma_device_id": "system", 00:21:46.107 "dma_device_type": 1 00:21:46.107 }, 00:21:46.107 { 00:21:46.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.107 "dma_device_type": 2 00:21:46.107 } 00:21:46.107 ], 00:21:46.107 "driver_specific": { 00:21:46.107 "passthru": { 00:21:46.107 "name": "pt3", 00:21:46.107 "base_bdev_name": "malloc3" 00:21:46.107 } 00:21:46.107 } 00:21:46.107 }' 00:21:46.107 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.107 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.107 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:46.107 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.107 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.367 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:46.367 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.367 13:30:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:21:46.367 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:46.367 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.367 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.367 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:46.367 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:46.367 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:21:46.367 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:46.627 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:46.627 "name": "pt4", 00:21:46.627 "aliases": [ 00:21:46.627 "00000000-0000-0000-0000-000000000004" 00:21:46.627 ], 00:21:46.627 "product_name": "passthru", 00:21:46.627 "block_size": 512, 00:21:46.627 "num_blocks": 65536, 00:21:46.627 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:46.627 "assigned_rate_limits": { 00:21:46.627 "rw_ios_per_sec": 0, 00:21:46.627 "rw_mbytes_per_sec": 0, 00:21:46.627 "r_mbytes_per_sec": 0, 00:21:46.627 "w_mbytes_per_sec": 0 00:21:46.627 }, 00:21:46.627 "claimed": true, 00:21:46.627 "claim_type": "exclusive_write", 00:21:46.627 "zoned": false, 00:21:46.627 "supported_io_types": { 00:21:46.627 "read": true, 00:21:46.627 "write": true, 00:21:46.627 "unmap": true, 00:21:46.627 "flush": true, 00:21:46.627 "reset": true, 00:21:46.627 "nvme_admin": false, 00:21:46.627 "nvme_io": false, 00:21:46.627 "nvme_io_md": false, 00:21:46.627 "write_zeroes": true, 00:21:46.627 "zcopy": true, 00:21:46.627 "get_zone_info": false, 00:21:46.627 "zone_management": false, 00:21:46.627 "zone_append": false, 00:21:46.627 "compare": false, 00:21:46.627 
"compare_and_write": false, 00:21:46.627 "abort": true, 00:21:46.627 "seek_hole": false, 00:21:46.627 "seek_data": false, 00:21:46.627 "copy": true, 00:21:46.627 "nvme_iov_md": false 00:21:46.627 }, 00:21:46.627 "memory_domains": [ 00:21:46.627 { 00:21:46.627 "dma_device_id": "system", 00:21:46.627 "dma_device_type": 1 00:21:46.627 }, 00:21:46.627 { 00:21:46.627 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:46.627 "dma_device_type": 2 00:21:46.627 } 00:21:46.627 ], 00:21:46.627 "driver_specific": { 00:21:46.627 "passthru": { 00:21:46.627 "name": "pt4", 00:21:46.627 "base_bdev_name": "malloc4" 00:21:46.627 } 00:21:46.627 } 00:21:46.627 }' 00:21:46.627 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.627 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:46.627 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:46.627 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.887 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:46.887 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:46.887 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.887 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:46.887 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:46.887 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.887 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:46.887 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:46.887 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:46.887 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:21:47.147 [2024-07-25 13:30:27.806569] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:47.147 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@502 -- # '[' d0a26647-64cf-4faf-bca0-b91eaab5329b '!=' d0a26647-64cf-4faf-bca0-b91eaab5329b ']' 00:21:47.147 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:21:47.147 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:47.147 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:47.147 13:30:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:47.407 [2024-07-25 13:30:27.998818] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:47.407 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:47.667 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:47.667 "name": "raid_bdev1", 00:21:47.667 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:47.667 "strip_size_kb": 0, 00:21:47.667 "state": "online", 00:21:47.667 "raid_level": "raid1", 00:21:47.667 "superblock": true, 00:21:47.667 "num_base_bdevs": 4, 00:21:47.667 "num_base_bdevs_discovered": 3, 00:21:47.667 "num_base_bdevs_operational": 3, 00:21:47.667 "base_bdevs_list": [ 00:21:47.667 { 00:21:47.667 "name": null, 00:21:47.667 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:47.667 "is_configured": false, 00:21:47.667 "data_offset": 2048, 00:21:47.667 "data_size": 63488 00:21:47.667 }, 00:21:47.667 { 00:21:47.667 "name": "pt2", 00:21:47.667 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:47.667 "is_configured": true, 00:21:47.667 "data_offset": 2048, 00:21:47.667 "data_size": 63488 00:21:47.667 }, 00:21:47.667 { 00:21:47.667 "name": "pt3", 00:21:47.667 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:47.667 "is_configured": true, 00:21:47.667 "data_offset": 2048, 00:21:47.667 "data_size": 63488 00:21:47.667 }, 00:21:47.667 { 00:21:47.667 "name": "pt4", 00:21:47.667 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:47.667 "is_configured": true, 00:21:47.667 "data_offset": 2048, 00:21:47.667 "data_size": 63488 00:21:47.667 } 00:21:47.667 ] 00:21:47.667 }' 00:21:47.667 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:47.667 
13:30:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:48.237 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:48.237 [2024-07-25 13:30:28.913108] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:48.238 [2024-07-25 13:30:28.913125] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:48.238 [2024-07-25 13:30:28.913161] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:48.238 [2024-07-25 13:30:28.913210] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:48.238 [2024-07-25 13:30:28.913216] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe17ce0 name raid_bdev1, state offline 00:21:48.238 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:48.238 13:30:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:21:48.498 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:21:48.498 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:21:48.498 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:21:48.498 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:21:48.498 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:48.758 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:21:48.758 13:30:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:21:48.758 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:48.758 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:21:48.758 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:21:48.758 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:49.018 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:21:49.019 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:21:49.019 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:21:49.019 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:21:49.019 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:49.279 [2024-07-25 13:30:29.855458] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:49.279 [2024-07-25 13:30:29.855492] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:49.279 [2024-07-25 13:30:29.855506] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfb72d0 00:21:49.279 [2024-07-25 13:30:29.855513] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:49.279 [2024-07-25 13:30:29.856806] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:49.279 [2024-07-25 13:30:29.856828] vbdev_passthru.c: 710:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: pt2 00:21:49.279 [2024-07-25 13:30:29.856877] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:49.279 [2024-07-25 13:30:29.856896] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:49.279 pt2 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:49.279 13:30:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:49.280 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:49.280 "name": "raid_bdev1", 00:21:49.280 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:49.280 "strip_size_kb": 0, 00:21:49.280 "state": "configuring", 
00:21:49.280 "raid_level": "raid1", 00:21:49.280 "superblock": true, 00:21:49.280 "num_base_bdevs": 4, 00:21:49.280 "num_base_bdevs_discovered": 1, 00:21:49.280 "num_base_bdevs_operational": 3, 00:21:49.280 "base_bdevs_list": [ 00:21:49.280 { 00:21:49.280 "name": null, 00:21:49.280 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:49.280 "is_configured": false, 00:21:49.280 "data_offset": 2048, 00:21:49.280 "data_size": 63488 00:21:49.280 }, 00:21:49.280 { 00:21:49.280 "name": "pt2", 00:21:49.280 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:49.280 "is_configured": true, 00:21:49.280 "data_offset": 2048, 00:21:49.280 "data_size": 63488 00:21:49.280 }, 00:21:49.280 { 00:21:49.280 "name": null, 00:21:49.280 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:49.280 "is_configured": false, 00:21:49.280 "data_offset": 2048, 00:21:49.280 "data_size": 63488 00:21:49.280 }, 00:21:49.280 { 00:21:49.280 "name": null, 00:21:49.280 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:49.280 "is_configured": false, 00:21:49.280 "data_offset": 2048, 00:21:49.280 "data_size": 63488 00:21:49.280 } 00:21:49.280 ] 00:21:49.280 }' 00:21:49.280 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:49.280 13:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:49.850 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:21:49.850 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:21:49.850 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:50.110 [2024-07-25 13:30:30.773788] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:50.110 [2024-07-25 13:30:30.773827] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:50.110 [2024-07-25 13:30:30.773840] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xfbf960 00:21:50.110 [2024-07-25 13:30:30.773847] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:50.110 [2024-07-25 13:30:30.774122] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:50.110 [2024-07-25 13:30:30.774133] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:50.110 [2024-07-25 13:30:30.774184] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:50.110 [2024-07-25 13:30:30.774197] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:50.110 pt3 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@530 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.110 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:50.370 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:50.370 "name": "raid_bdev1", 00:21:50.370 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:50.370 "strip_size_kb": 0, 00:21:50.370 "state": "configuring", 00:21:50.370 "raid_level": "raid1", 00:21:50.370 "superblock": true, 00:21:50.370 "num_base_bdevs": 4, 00:21:50.370 "num_base_bdevs_discovered": 2, 00:21:50.370 "num_base_bdevs_operational": 3, 00:21:50.370 "base_bdevs_list": [ 00:21:50.370 { 00:21:50.371 "name": null, 00:21:50.371 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:50.371 "is_configured": false, 00:21:50.371 "data_offset": 2048, 00:21:50.371 "data_size": 63488 00:21:50.371 }, 00:21:50.371 { 00:21:50.371 "name": "pt2", 00:21:50.371 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:50.371 "is_configured": true, 00:21:50.371 "data_offset": 2048, 00:21:50.371 "data_size": 63488 00:21:50.371 }, 00:21:50.371 { 00:21:50.371 "name": "pt3", 00:21:50.371 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:50.371 "is_configured": true, 00:21:50.371 "data_offset": 2048, 00:21:50.371 "data_size": 63488 00:21:50.371 }, 00:21:50.371 { 00:21:50.371 "name": null, 00:21:50.371 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:50.371 "is_configured": false, 00:21:50.371 "data_offset": 2048, 00:21:50.371 "data_size": 63488 00:21:50.371 } 00:21:50.371 ] 00:21:50.371 }' 00:21:50.371 13:30:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:50.371 13:30:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # (( i++ )) 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # i=3 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:50.942 [2024-07-25 13:30:31.676066] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:50.942 [2024-07-25 13:30:31.676106] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:50.942 [2024-07-25 13:30:31.676118] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe13d40 00:21:50.942 [2024-07-25 13:30:31.676124] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:50.942 [2024-07-25 13:30:31.676393] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:50.942 [2024-07-25 13:30:31.676405] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:50.942 [2024-07-25 13:30:31.676449] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:50.942 [2024-07-25 13:30:31.676465] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:50.942 [2024-07-25 13:30:31.676574] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xe14480 00:21:50.942 [2024-07-25 13:30:31.676585] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:50.942 [2024-07-25 13:30:31.676754] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xfb54b0 00:21:50.942 [2024-07-25 13:30:31.676863] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe14480 00:21:50.942 [2024-07-25 13:30:31.676868] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe14480 
00:21:50.942 [2024-07-25 13:30:31.676943] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:50.942 pt4 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:50.942 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:51.202 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:51.202 "name": "raid_bdev1", 00:21:51.202 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:51.202 "strip_size_kb": 0, 00:21:51.202 "state": "online", 00:21:51.202 "raid_level": "raid1", 00:21:51.202 "superblock": true, 00:21:51.202 "num_base_bdevs": 4, 00:21:51.202 "num_base_bdevs_discovered": 3, 00:21:51.202 
"num_base_bdevs_operational": 3, 00:21:51.202 "base_bdevs_list": [ 00:21:51.202 { 00:21:51.202 "name": null, 00:21:51.202 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:51.202 "is_configured": false, 00:21:51.202 "data_offset": 2048, 00:21:51.202 "data_size": 63488 00:21:51.202 }, 00:21:51.202 { 00:21:51.202 "name": "pt2", 00:21:51.202 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:51.202 "is_configured": true, 00:21:51.202 "data_offset": 2048, 00:21:51.202 "data_size": 63488 00:21:51.202 }, 00:21:51.202 { 00:21:51.202 "name": "pt3", 00:21:51.202 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:51.202 "is_configured": true, 00:21:51.202 "data_offset": 2048, 00:21:51.202 "data_size": 63488 00:21:51.202 }, 00:21:51.202 { 00:21:51.202 "name": "pt4", 00:21:51.202 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:51.202 "is_configured": true, 00:21:51.202 "data_offset": 2048, 00:21:51.202 "data_size": 63488 00:21:51.202 } 00:21:51.202 ] 00:21:51.202 }' 00:21:51.202 13:30:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:51.202 13:30:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:51.774 13:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:52.034 [2024-07-25 13:30:32.630487] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:52.034 [2024-07-25 13:30:32.630505] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:52.034 [2024-07-25 13:30:32.630541] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:52.034 [2024-07-25 13:30:32.630593] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:52.034 [2024-07-25 13:30:32.630599] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0xe14480 name raid_bdev1, state offline 00:21:52.034 13:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:21:52.034 13:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.294 13:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:21:52.294 13:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:21:52.294 13:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@547 -- # '[' 4 -gt 2 ']' 00:21:52.294 13:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@549 -- # i=3 00:21:52.294 13:30:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@550 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:52.294 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:52.554 [2024-07-25 13:30:33.211937] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:52.554 [2024-07-25 13:30:33.211966] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:52.554 [2024-07-25 13:30:33.211975] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe14480 00:21:52.554 [2024-07-25 13:30:33.211981] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:52.555 [2024-07-25 13:30:33.213247] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:52.555 [2024-07-25 13:30:33.213268] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:52.555 [2024-07-25 13:30:33.213315] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt1 00:21:52.555 [2024-07-25 13:30:33.213333] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:52.555 [2024-07-25 13:30:33.213408] bdev_raid.c:3665:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:21:52.555 [2024-07-25 13:30:33.213416] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:52.555 [2024-07-25 13:30:33.213424] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe15970 name raid_bdev1, state configuring 00:21:52.555 [2024-07-25 13:30:33.213438] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:52.555 [2024-07-25 13:30:33.213495] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:52.555 pt1 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 4 -gt 2 ']' 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@560 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:52.555 13:30:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.555 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:52.814 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:52.814 "name": "raid_bdev1", 00:21:52.814 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:52.814 "strip_size_kb": 0, 00:21:52.814 "state": "configuring", 00:21:52.814 "raid_level": "raid1", 00:21:52.814 "superblock": true, 00:21:52.814 "num_base_bdevs": 4, 00:21:52.814 "num_base_bdevs_discovered": 2, 00:21:52.814 "num_base_bdevs_operational": 3, 00:21:52.814 "base_bdevs_list": [ 00:21:52.814 { 00:21:52.814 "name": null, 00:21:52.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:52.814 "is_configured": false, 00:21:52.814 "data_offset": 2048, 00:21:52.814 "data_size": 63488 00:21:52.814 }, 00:21:52.814 { 00:21:52.814 "name": "pt2", 00:21:52.814 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:52.814 "is_configured": true, 00:21:52.814 "data_offset": 2048, 00:21:52.814 "data_size": 63488 00:21:52.814 }, 00:21:52.814 { 00:21:52.814 "name": "pt3", 00:21:52.814 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:52.814 "is_configured": true, 00:21:52.814 "data_offset": 2048, 00:21:52.814 "data_size": 63488 00:21:52.814 }, 00:21:52.814 { 00:21:52.814 "name": null, 00:21:52.814 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:52.815 "is_configured": false, 00:21:52.815 "data_offset": 2048, 00:21:52.815 "data_size": 63488 00:21:52.815 } 00:21:52.815 ] 00:21:52.815 }' 00:21:52.815 13:30:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:52.815 13:30:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set 
+x 00:21:53.384 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:53.384 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@561 -- # [[ false == \f\a\l\s\e ]] 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:53.644 [2024-07-25 13:30:34.386918] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:53.644 [2024-07-25 13:30:34.386951] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:53.644 [2024-07-25 13:30:34.386961] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe16f80 00:21:53.644 [2024-07-25 13:30:34.386967] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:53.644 [2024-07-25 13:30:34.387235] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:53.644 [2024-07-25 13:30:34.387247] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:53.644 [2024-07-25 13:30:34.387292] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:53.644 [2024-07-25 13:30:34.387307] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:53.644 [2024-07-25 13:30:34.387396] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xe1ce50 00:21:53.644 [2024-07-25 13:30:34.387402] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:53.644 [2024-07-25 13:30:34.387538] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0xe16190 00:21:53.644 [2024-07-25 13:30:34.387647] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe1ce50 00:21:53.644 [2024-07-25 13:30:34.387652] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe1ce50 00:21:53.644 [2024-07-25 13:30:34.387724] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:53.644 pt4 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:53.644 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:53.905 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:53.905 "name": "raid_bdev1", 
00:21:53.905 "uuid": "d0a26647-64cf-4faf-bca0-b91eaab5329b", 00:21:53.905 "strip_size_kb": 0, 00:21:53.905 "state": "online", 00:21:53.905 "raid_level": "raid1", 00:21:53.905 "superblock": true, 00:21:53.905 "num_base_bdevs": 4, 00:21:53.905 "num_base_bdevs_discovered": 3, 00:21:53.905 "num_base_bdevs_operational": 3, 00:21:53.905 "base_bdevs_list": [ 00:21:53.905 { 00:21:53.905 "name": null, 00:21:53.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:53.905 "is_configured": false, 00:21:53.905 "data_offset": 2048, 00:21:53.905 "data_size": 63488 00:21:53.905 }, 00:21:53.905 { 00:21:53.905 "name": "pt2", 00:21:53.905 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:53.905 "is_configured": true, 00:21:53.905 "data_offset": 2048, 00:21:53.905 "data_size": 63488 00:21:53.905 }, 00:21:53.905 { 00:21:53.905 "name": "pt3", 00:21:53.905 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:53.905 "is_configured": true, 00:21:53.905 "data_offset": 2048, 00:21:53.905 "data_size": 63488 00:21:53.905 }, 00:21:53.905 { 00:21:53.905 "name": "pt4", 00:21:53.905 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:53.905 "is_configured": true, 00:21:53.905 "data_offset": 2048, 00:21:53.905 "data_size": 63488 00:21:53.905 } 00:21:53.905 ] 00:21:53.905 }' 00:21:53.905 13:30:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:53.905 13:30:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:54.476 13:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:54.476 13:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:21:54.737 13:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:21:54.737 13:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:54.737 13:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:21:54.737 [2024-07-25 13:30:35.509985] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:54.737 13:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@573 -- # '[' d0a26647-64cf-4faf-bca0-b91eaab5329b '!=' d0a26647-64cf-4faf-bca0-b91eaab5329b ']' 00:21:54.737 13:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@578 -- # killprocess 987176 00:21:54.737 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@950 -- # '[' -z 987176 ']' 00:21:54.737 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # kill -0 987176 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # uname 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 987176 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 987176' 00:21:54.998 killing process with pid 987176 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@969 -- # kill 987176 00:21:54.998 [2024-07-25 13:30:35.585121] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:54.998 [2024-07-25 13:30:35.585157] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:54.998 [2024-07-25 13:30:35.585206] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid 
bdev base bdevs is 0, going to free all in destruct 00:21:54.998 [2024-07-25 13:30:35.585212] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe1ce50 name raid_bdev1, state offline 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@974 -- # wait 987176 00:21:54.998 [2024-07-25 13:30:35.605833] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@580 -- # return 0 00:21:54.998 00:21:54.998 real 0m29.560s 00:21:54.998 user 0m55.505s 00:21:54.998 sys 0m3.841s 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:21:54.998 13:30:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:54.998 ************************************ 00:21:54.998 END TEST raid_superblock_test 00:21:54.998 ************************************ 00:21:54.998 13:30:35 bdev_raid -- bdev/bdev_raid.sh@950 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:21:54.998 13:30:35 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:21:54.998 13:30:35 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:21:54.998 13:30:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:55.258 ************************************ 00:21:55.258 START TEST raid_read_error_test 00:21:55.258 ************************************ 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 4 read 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=read 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 
00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:21:55.259 13:30:35 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.3cbJlIlwqi 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=993030 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 993030 /var/tmp/spdk-raid.sock 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@831 -- # '[' -z 993030 ']' 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:55.259 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:21:55.259 13:30:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:55.259 [2024-07-25 13:30:35.872088] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:21:55.259 [2024-07-25 13:30:35.872136] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid993030 ] 00:21:55.259 [2024-07-25 13:30:35.959964] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:55.259 [2024-07-25 13:30:36.023175] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:21:55.519 [2024-07-25 13:30:36.072764] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:55.519 [2024-07-25 13:30:36.072785] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:56.089 13:30:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:21:56.089 13:30:36 bdev_raid.raid_read_error_test -- common/autotest_common.sh@864 -- # return 0 00:21:56.089 13:30:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:56.089 13:30:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:56.350 BaseBdev1_malloc 00:21:56.350 13:30:36 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:56.350 true 00:21:56.350 13:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:56.609 [2024-07-25 13:30:37.239981] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:56.609 [2024-07-25 13:30:37.240012] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:21:56.609 [2024-07-25 13:30:37.240022] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b4b2a0 00:21:56.609 [2024-07-25 13:30:37.240029] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:56.610 [2024-07-25 13:30:37.241293] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:56.610 [2024-07-25 13:30:37.241313] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:56.610 BaseBdev1 00:21:56.610 13:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:56.610 13:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:56.870 BaseBdev2_malloc 00:21:56.870 13:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:56.870 true 00:21:56.870 13:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:57.131 [2024-07-25 13:30:37.799161] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:57.131 [2024-07-25 13:30:37.799191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:57.131 [2024-07-25 13:30:37.799202] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c0a420 00:21:57.131 [2024-07-25 13:30:37.799213] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:57.131 [2024-07-25 13:30:37.800364] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:57.131 [2024-07-25 13:30:37.800382] 
vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:57.131 BaseBdev2 00:21:57.131 13:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:57.131 13:30:37 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:57.391 BaseBdev3_malloc 00:21:57.391 13:30:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:57.651 true 00:21:57.651 13:30:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:57.651 [2024-07-25 13:30:38.382477] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:57.651 [2024-07-25 13:30:38.382505] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:57.651 [2024-07-25 13:30:38.382518] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c0bf70 00:21:57.651 [2024-07-25 13:30:38.382525] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:57.651 [2024-07-25 13:30:38.383711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:57.651 [2024-07-25 13:30:38.383729] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:57.651 BaseBdev3 00:21:57.651 13:30:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:21:57.651 13:30:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 
512 -b BaseBdev4_malloc 00:21:57.911 BaseBdev4_malloc 00:21:57.911 13:30:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:58.172 true 00:21:58.172 13:30:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:58.432 [2024-07-25 13:30:38.977846] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:58.432 [2024-07-25 13:30:38.977875] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:58.432 [2024-07-25 13:30:38.977886] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c0f1e0 00:21:58.432 [2024-07-25 13:30:38.977893] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:58.432 [2024-07-25 13:30:38.979086] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:58.432 [2024-07-25 13:30:38.979104] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:58.432 BaseBdev4 00:21:58.432 13:30:38 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:58.432 [2024-07-25 13:30:39.154338] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:58.432 [2024-07-25 13:30:39.155337] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:58.432 [2024-07-25 13:30:39.155391] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:58.432 [2024-07-25 13:30:39.155438] bdev_raid.c:3312:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:21:58.432 [2024-07-25 13:30:39.155611] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c0f800 00:21:58.432 [2024-07-25 13:30:39.155618] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:58.432 [2024-07-25 13:30:39.155774] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c0a950 00:21:58.432 [2024-07-25 13:30:39.155895] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c0f800 00:21:58.432 [2024-07-25 13:30:39.155901] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c0f800 00:21:58.432 [2024-07-25 13:30:39.155986] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.432 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.692 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.692 "name": "raid_bdev1", 00:21:58.692 "uuid": "d025e8d9-7635-4777-ab95-e6cdcaf30bc7", 00:21:58.692 "strip_size_kb": 0, 00:21:58.692 "state": "online", 00:21:58.692 "raid_level": "raid1", 00:21:58.692 "superblock": true, 00:21:58.692 "num_base_bdevs": 4, 00:21:58.692 "num_base_bdevs_discovered": 4, 00:21:58.692 "num_base_bdevs_operational": 4, 00:21:58.692 "base_bdevs_list": [ 00:21:58.692 { 00:21:58.692 "name": "BaseBdev1", 00:21:58.692 "uuid": "f8c2d32d-8657-5fca-9fe9-d1c38bea87ca", 00:21:58.692 "is_configured": true, 00:21:58.692 "data_offset": 2048, 00:21:58.692 "data_size": 63488 00:21:58.692 }, 00:21:58.692 { 00:21:58.692 "name": "BaseBdev2", 00:21:58.692 "uuid": "4e7ea46c-f0e2-50ad-a15e-1082c9457f49", 00:21:58.692 "is_configured": true, 00:21:58.692 "data_offset": 2048, 00:21:58.692 "data_size": 63488 00:21:58.692 }, 00:21:58.692 { 00:21:58.692 "name": "BaseBdev3", 00:21:58.692 "uuid": "046223fc-2d2f-529c-a6ba-c746a9beeb24", 00:21:58.692 "is_configured": true, 00:21:58.692 "data_offset": 2048, 00:21:58.692 "data_size": 63488 00:21:58.692 }, 00:21:58.692 { 00:21:58.692 "name": "BaseBdev4", 00:21:58.692 "uuid": "4bd589cd-4fa7-54f0-90a6-b3956d86eb75", 00:21:58.692 "is_configured": true, 00:21:58.692 "data_offset": 2048, 00:21:58.692 "data_size": 63488 00:21:58.692 } 00:21:58.692 ] 00:21:58.692 }' 00:21:58.692 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.692 13:30:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:59.262 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:59.262 13:30:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:21:59.262 [2024-07-25 13:30:39.968619] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c102b0 00:22:00.203 13:30:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:22:00.463 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # local expected_num_base_bdevs 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@846 -- # [[ read = \w\r\i\t\e ]] 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@849 -- # expected_num_base_bdevs=4 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.464 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.724 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.724 "name": "raid_bdev1", 00:22:00.724 "uuid": "d025e8d9-7635-4777-ab95-e6cdcaf30bc7", 00:22:00.724 "strip_size_kb": 0, 00:22:00.724 "state": "online", 00:22:00.724 "raid_level": "raid1", 00:22:00.724 "superblock": true, 00:22:00.724 "num_base_bdevs": 4, 00:22:00.724 "num_base_bdevs_discovered": 4, 00:22:00.724 "num_base_bdevs_operational": 4, 00:22:00.724 "base_bdevs_list": [ 00:22:00.724 { 00:22:00.724 "name": "BaseBdev1", 00:22:00.724 "uuid": "f8c2d32d-8657-5fca-9fe9-d1c38bea87ca", 00:22:00.724 "is_configured": true, 00:22:00.724 "data_offset": 2048, 00:22:00.724 "data_size": 63488 00:22:00.724 }, 00:22:00.724 { 00:22:00.724 "name": "BaseBdev2", 00:22:00.724 "uuid": "4e7ea46c-f0e2-50ad-a15e-1082c9457f49", 00:22:00.724 "is_configured": true, 00:22:00.724 "data_offset": 2048, 00:22:00.724 "data_size": 63488 00:22:00.724 }, 00:22:00.724 { 00:22:00.724 "name": "BaseBdev3", 00:22:00.724 "uuid": "046223fc-2d2f-529c-a6ba-c746a9beeb24", 00:22:00.724 "is_configured": true, 00:22:00.724 "data_offset": 2048, 00:22:00.724 "data_size": 63488 00:22:00.724 }, 00:22:00.724 { 00:22:00.724 "name": "BaseBdev4", 00:22:00.724 "uuid": "4bd589cd-4fa7-54f0-90a6-b3956d86eb75", 00:22:00.724 "is_configured": true, 00:22:00.724 "data_offset": 2048, 00:22:00.724 "data_size": 63488 00:22:00.724 } 00:22:00.724 ] 00:22:00.724 }' 00:22:00.724 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.724 13:30:41 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.293 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:01.293 [2024-07-25 13:30:41.946227] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:01.293 [2024-07-25 13:30:41.946261] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:01.293 [2024-07-25 13:30:41.948833] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:01.293 [2024-07-25 13:30:41.948861] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:01.293 [2024-07-25 13:30:41.948954] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:01.293 [2024-07-25 13:30:41.948962] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c0f800 name raid_bdev1, state offline 00:22:01.293 0 00:22:01.293 13:30:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 993030 00:22:01.294 13:30:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@950 -- # '[' -z 993030 ']' 00:22:01.294 13:30:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # kill -0 993030 00:22:01.294 13:30:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # uname 00:22:01.294 13:30:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:01.294 13:30:41 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 993030 00:22:01.294 13:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:01.294 13:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:01.294 13:30:42 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 993030' 00:22:01.294 killing process with pid 993030 00:22:01.294 13:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@969 -- # kill 993030 00:22:01.294 [2024-07-25 13:30:42.029198] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:01.294 13:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@974 -- # wait 993030 00:22:01.294 [2024-07-25 13:30:42.046446] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:01.555 13:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep -v Job /raidtest/tmp.3cbJlIlwqi 00:22:01.555 13:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:22:01.555 13:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:22:01.555 13:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:22:01.555 13:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:22:01.555 13:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:01.555 13:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:01.555 13:30:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:01.555 00:22:01.555 real 0m6.377s 00:22:01.555 user 0m10.232s 00:22:01.555 sys 0m0.891s 00:22:01.555 13:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:22:01.555 13:30:42 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.555 ************************************ 00:22:01.555 END TEST raid_read_error_test 00:22:01.555 ************************************ 00:22:01.555 13:30:42 bdev_raid -- bdev/bdev_raid.sh@951 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:22:01.555 13:30:42 bdev_raid -- 
common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:22:01.555 13:30:42 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:01.555 13:30:42 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:01.555 ************************************ 00:22:01.555 START TEST raid_write_error_test 00:22:01.555 ************************************ 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # raid_io_error_test raid1 4 write 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@804 -- # local raid_level=raid1 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # local num_base_bdevs=4 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@806 -- # local error_io_type=write 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i = 1 )) 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev1 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev2 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # echo BaseBdev3 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@809 -- # echo BaseBdev4 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i++ )) 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # (( i <= num_base_bdevs )) 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # local base_bdevs 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # local raid_bdev_name=raid_bdev1 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # local strip_size 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@810 -- # local create_arg 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@811 -- # local bdevperf_log 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # local fail_per_s 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # '[' raid1 '!=' raid1 ']' 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@818 -- # strip_size=0 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # mktemp -p /raidtest 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@821 -- # bdevperf_log=/raidtest/tmp.1JZBqxla4m 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # raid_pid=994079 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@825 -- # waitforlisten 994079 /var/tmp/spdk-raid.sock 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@831 -- # '[' -z 994079 ']' 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:01.555 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:01.555 13:30:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:01.555 [2024-07-25 13:30:42.328784] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:22:01.555 [2024-07-25 13:30:42.328834] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid994079 ] 00:22:01.816 [2024-07-25 13:30:42.417729] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:01.816 [2024-07-25 13:30:42.482073] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:01.816 [2024-07-25 13:30:42.521488] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:01.816 [2024-07-25 13:30:42.521513] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:02.385 13:30:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:02.386 13:30:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@864 -- # return 0 00:22:02.386 13:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:22:02.386 13:30:43 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:02.646 BaseBdev1_malloc 00:22:02.646 13:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:22:02.646 true 00:22:02.905 13:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:22:02.905 [2024-07-25 13:30:43.579726] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:22:02.905 [2024-07-25 13:30:43.579761] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:02.905 [2024-07-25 13:30:43.579771] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa572a0 00:22:02.905 [2024-07-25 13:30:43.579777] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:02.905 [2024-07-25 13:30:43.581020] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:02.905 [2024-07-25 13:30:43.581043] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:02.905 BaseBdev1 00:22:02.905 13:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:22:02.905 13:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:03.165 BaseBdev2_malloc 00:22:03.166 13:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_error_create BaseBdev2_malloc 00:22:03.166 true 00:22:03.166 13:30:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:22:03.426 [2024-07-25 13:30:44.034417] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:22:03.426 [2024-07-25 13:30:44.034442] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:03.426 [2024-07-25 13:30:44.034452] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb16420 00:22:03.426 [2024-07-25 13:30:44.034459] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:03.426 [2024-07-25 13:30:44.035598] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:03.426 [2024-07-25 13:30:44.035617] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:03.426 BaseBdev2 00:22:03.426 13:30:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:22:03.426 13:30:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:22:03.426 BaseBdev3_malloc 00:22:03.426 13:30:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:22:03.686 true 00:22:03.686 13:30:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:22:03.950 [2024-07-25 13:30:44.581380] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 
00:22:03.950 [2024-07-25 13:30:44.581408] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:03.950 [2024-07-25 13:30:44.581419] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb17f70 00:22:03.950 [2024-07-25 13:30:44.581425] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:03.950 [2024-07-25 13:30:44.582576] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:03.950 [2024-07-25 13:30:44.582594] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:22:03.950 BaseBdev3 00:22:03.950 13:30:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@828 -- # for bdev in "${base_bdevs[@]}" 00:22:03.950 13:30:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:22:04.256 BaseBdev4_malloc 00:22:04.256 13:30:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:22:04.256 true 00:22:04.256 13:30:44 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:22:04.522 [2024-07-25 13:30:45.048437] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:22:04.522 [2024-07-25 13:30:45.048464] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:04.522 [2024-07-25 13:30:45.048474] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb1b1e0 00:22:04.522 [2024-07-25 13:30:45.048480] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:04.522 [2024-07-25 13:30:45.049634] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:04.522 [2024-07-25 13:30:45.049652] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:22:04.522 BaseBdev4 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:22:04.522 [2024-07-25 13:30:45.200848] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:04.522 [2024-07-25 13:30:45.201822] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:04.522 [2024-07-25 13:30:45.201874] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:22:04.522 [2024-07-25 13:30:45.201919] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:22:04.522 [2024-07-25 13:30:45.202086] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xb1b800 00:22:04.522 [2024-07-25 13:30:45.202093] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:04.522 [2024-07-25 13:30:45.202237] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb16950 00:22:04.522 [2024-07-25 13:30:45.202356] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb1b800 00:22:04.522 [2024-07-25 13:30:45.202362] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb1b800 00:22:04.522 [2024-07-25 13:30:45.202445] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@836 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:04.522 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:04.782 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:04.782 "name": "raid_bdev1", 00:22:04.782 "uuid": "127484ab-663d-40cb-91bb-6b07fa24b293", 00:22:04.782 "strip_size_kb": 0, 00:22:04.782 "state": "online", 00:22:04.782 "raid_level": "raid1", 00:22:04.782 "superblock": true, 00:22:04.782 "num_base_bdevs": 4, 00:22:04.782 "num_base_bdevs_discovered": 4, 00:22:04.782 "num_base_bdevs_operational": 4, 00:22:04.782 "base_bdevs_list": [ 00:22:04.782 { 00:22:04.782 "name": "BaseBdev1", 00:22:04.782 "uuid": "b337938c-ab33-59ed-b377-cbb11789495b", 00:22:04.782 "is_configured": true, 00:22:04.782 "data_offset": 2048, 00:22:04.782 "data_size": 63488 00:22:04.782 }, 00:22:04.782 { 00:22:04.782 "name": "BaseBdev2", 00:22:04.782 "uuid": 
"abd3b5d6-de61-549a-b2d4-5d570a101671", 00:22:04.782 "is_configured": true, 00:22:04.782 "data_offset": 2048, 00:22:04.782 "data_size": 63488 00:22:04.782 }, 00:22:04.782 { 00:22:04.782 "name": "BaseBdev3", 00:22:04.782 "uuid": "2830c72b-9b26-55eb-9026-010a901a1cf8", 00:22:04.782 "is_configured": true, 00:22:04.782 "data_offset": 2048, 00:22:04.782 "data_size": 63488 00:22:04.782 }, 00:22:04.782 { 00:22:04.782 "name": "BaseBdev4", 00:22:04.782 "uuid": "cfa9bcb6-8a1c-5903-984f-42fda94e59c7", 00:22:04.782 "is_configured": true, 00:22:04.782 "data_offset": 2048, 00:22:04.782 "data_size": 63488 00:22:04.782 } 00:22:04.782 ] 00:22:04.782 }' 00:22:04.782 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:04.782 13:30:45 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:05.352 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@840 -- # sleep 1 00:22:05.352 13:30:45 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:05.352 [2024-07-25 13:30:46.011135] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb1c2b0 00:22:06.292 13:30:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:22:06.552 [2024-07-25 13:30:47.105218] bdev_raid.c:2263:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:22:06.552 [2024-07-25 13:30:47.105265] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:06.552 [2024-07-25 13:30:47.105463] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xb1c2b0 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # local 
expected_num_base_bdevs 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ raid1 = \r\a\i\d\1 ]] 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@846 -- # [[ write = \w\r\i\t\e ]] 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # expected_num_base_bdevs=3 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@851 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:06.552 "name": "raid_bdev1", 00:22:06.552 "uuid": "127484ab-663d-40cb-91bb-6b07fa24b293", 
00:22:06.552 "strip_size_kb": 0, 00:22:06.552 "state": "online", 00:22:06.552 "raid_level": "raid1", 00:22:06.552 "superblock": true, 00:22:06.552 "num_base_bdevs": 4, 00:22:06.552 "num_base_bdevs_discovered": 3, 00:22:06.552 "num_base_bdevs_operational": 3, 00:22:06.552 "base_bdevs_list": [ 00:22:06.552 { 00:22:06.552 "name": null, 00:22:06.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:06.552 "is_configured": false, 00:22:06.552 "data_offset": 2048, 00:22:06.552 "data_size": 63488 00:22:06.552 }, 00:22:06.552 { 00:22:06.552 "name": "BaseBdev2", 00:22:06.552 "uuid": "abd3b5d6-de61-549a-b2d4-5d570a101671", 00:22:06.552 "is_configured": true, 00:22:06.552 "data_offset": 2048, 00:22:06.552 "data_size": 63488 00:22:06.552 }, 00:22:06.552 { 00:22:06.552 "name": "BaseBdev3", 00:22:06.552 "uuid": "2830c72b-9b26-55eb-9026-010a901a1cf8", 00:22:06.552 "is_configured": true, 00:22:06.552 "data_offset": 2048, 00:22:06.552 "data_size": 63488 00:22:06.552 }, 00:22:06.552 { 00:22:06.552 "name": "BaseBdev4", 00:22:06.552 "uuid": "cfa9bcb6-8a1c-5903-984f-42fda94e59c7", 00:22:06.552 "is_configured": true, 00:22:06.552 "data_offset": 2048, 00:22:06.552 "data_size": 63488 00:22:06.552 } 00:22:06.552 ] 00:22:06.552 }' 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:06.552 13:30:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.121 13:30:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@853 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:07.382 [2024-07-25 13:30:48.018843] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:07.382 [2024-07-25 13:30:48.018871] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:07.382 [2024-07-25 13:30:48.021436] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 
00:22:07.382 [2024-07-25 13:30:48.021460] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:07.382 [2024-07-25 13:30:48.021536] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:07.382 [2024-07-25 13:30:48.021542] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb1b800 name raid_bdev1, state offline 00:22:07.382 0 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@855 -- # killprocess 994079 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@950 -- # '[' -z 994079 ']' 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # kill -0 994079 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # uname 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 994079 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 994079' 00:22:07.382 killing process with pid 994079 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@969 -- # kill 994079 00:22:07.382 [2024-07-25 13:30:48.087117] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:07.382 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@974 -- # wait 994079 00:22:07.382 [2024-07-25 13:30:48.104487] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:07.643 13:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # 
grep -v Job /raidtest/tmp.1JZBqxla4m 00:22:07.643 13:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # grep raid_bdev1 00:22:07.643 13:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # awk '{print $6}' 00:22:07.643 13:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@859 -- # fail_per_s=0.00 00:22:07.643 13:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@860 -- # has_redundancy raid1 00:22:07.643 13:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:22:07.643 13:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:22:07.643 13:30:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@861 -- # [[ 0.00 = \0\.\0\0 ]] 00:22:07.643 00:22:07.643 real 0m5.982s 00:22:07.643 user 0m9.512s 00:22:07.643 sys 0m0.875s 00:22:07.643 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:22:07.643 13:30:48 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.643 ************************************ 00:22:07.643 END TEST raid_write_error_test 00:22:07.643 ************************************ 00:22:07.643 13:30:48 bdev_raid -- bdev/bdev_raid.sh@955 -- # '[' true = true ']' 00:22:07.643 13:30:48 bdev_raid -- bdev/bdev_raid.sh@956 -- # for n in 2 4 00:22:07.643 13:30:48 bdev_raid -- bdev/bdev_raid.sh@957 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:22:07.643 13:30:48 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:22:07.643 13:30:48 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:07.643 13:30:48 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:07.643 ************************************ 00:22:07.643 START TEST raid_rebuild_test 00:22:07.643 ************************************ 00:22:07.643 13:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 false false true 
00:22:07.643 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:22:07.643 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:22:07.643 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:22:07.643 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:22:07.643 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # local verify=true 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # local strip_size 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@592 -- # local create_arg 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # local 
data_offset 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # raid_pid=995309 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # waitforlisten 995309 /var/tmp/spdk-raid.sock 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # '[' -z 995309 ']' 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:07.644 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:07.644 13:30:48 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:07.644 [2024-07-25 13:30:48.385057] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:22:07.644 [2024-07-25 13:30:48.385116] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid995309 ] 00:22:07.644 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:07.644 Zero copy mechanism will not be used. 00:22:07.904 [2024-07-25 13:30:48.476260] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:07.904 [2024-07-25 13:30:48.552688] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:07.904 [2024-07-25 13:30:48.595834] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:07.904 [2024-07-25 13:30:48.595859] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:08.474 13:30:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:08.474 13:30:49 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # return 0 00:22:08.474 13:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:22:08.474 13:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:08.734 BaseBdev1_malloc 00:22:08.734 13:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:08.734 [2024-07-25 13:30:49.485945] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:08.734 [2024-07-25 13:30:49.485982] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:08.734 [2024-07-25 13:30:49.485995] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device 
created at: 0x0x96cd10 00:22:08.734 [2024-07-25 13:30:49.486002] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:08.734 [2024-07-25 13:30:49.487215] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:08.734 [2024-07-25 13:30:49.487233] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:08.734 BaseBdev1 00:22:08.734 13:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:22:08.734 13:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:08.994 BaseBdev2_malloc 00:22:08.994 13:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:09.253 [2024-07-25 13:30:49.796480] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:09.253 [2024-07-25 13:30:49.796507] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.253 [2024-07-25 13:30:49.796520] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x96d6d0 00:22:09.253 [2024-07-25 13:30:49.796526] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.253 [2024-07-25 13:30:49.797661] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.253 [2024-07-25 13:30:49.797680] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:09.253 BaseBdev2 00:22:09.253 13:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:09.253 spare_malloc 00:22:09.253 
13:30:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:09.513 spare_delay 00:22:09.513 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:09.772 [2024-07-25 13:30:50.307528] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:09.772 [2024-07-25 13:30:50.307561] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.772 [2024-07-25 13:30:50.307573] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x964ac0 00:22:09.772 [2024-07-25 13:30:50.307580] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.772 [2024-07-25 13:30:50.308740] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.772 [2024-07-25 13:30:50.308758] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:09.772 spare 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:09.773 [2024-07-25 13:30:50.504127] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:09.773 [2024-07-25 13:30:50.505094] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:09.773 [2024-07-25 13:30:50.505155] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x965c70 00:22:09.773 [2024-07-25 13:30:50.505161] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:09.773 [2024-07-25 13:30:50.505310] 
bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x96d960 00:22:09.773 [2024-07-25 13:30:50.505414] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x965c70 00:22:09.773 [2024-07-25 13:30:50.505419] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x965c70 00:22:09.773 [2024-07-25 13:30:50.505498] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.773 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.033 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.033 "name": "raid_bdev1", 
00:22:10.033 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:10.033 "strip_size_kb": 0, 00:22:10.033 "state": "online", 00:22:10.033 "raid_level": "raid1", 00:22:10.033 "superblock": false, 00:22:10.033 "num_base_bdevs": 2, 00:22:10.033 "num_base_bdevs_discovered": 2, 00:22:10.033 "num_base_bdevs_operational": 2, 00:22:10.033 "base_bdevs_list": [ 00:22:10.033 { 00:22:10.033 "name": "BaseBdev1", 00:22:10.033 "uuid": "c7dbaf35-338d-5357-90d3-f4e0865e0b5e", 00:22:10.033 "is_configured": true, 00:22:10.033 "data_offset": 0, 00:22:10.033 "data_size": 65536 00:22:10.033 }, 00:22:10.033 { 00:22:10.033 "name": "BaseBdev2", 00:22:10.033 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:10.033 "is_configured": true, 00:22:10.033 "data_offset": 0, 00:22:10.033 "data_size": 65536 00:22:10.033 } 00:22:10.033 ] 00:22:10.033 }' 00:22:10.033 13:30:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.033 13:30:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:10.603 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:10.603 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:22:10.864 [2024-07-25 13:30:51.450717] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:10.864 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:22:10.864 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.864 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:22:11.124 13:30:51 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:11.124 [2024-07-25 13:30:51.831508] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x96d960 00:22:11.124 /dev/nbd0 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:22:11.124 13:30:51 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:11.124 1+0 records in 00:22:11.124 1+0 records out 00:22:11.124 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288053 s, 14.2 MB/s 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:22:11.124 13:30:51 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:22:21.115 65536+0 records in 00:22:21.115 65536+0 records out 00:22:21.115 33554432 bytes (34 MB, 32 MiB) copied, 8.59391 s, 3.9 MB/s 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:21.115 [2024-07-25 13:31:00.680722] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test 
-- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:21.115 [2024-07-25 13:31:00.857604] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:21.115 13:31:00 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:21.115 13:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:21.115 "name": "raid_bdev1", 00:22:21.115 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:21.115 "strip_size_kb": 0, 00:22:21.115 "state": "online", 00:22:21.115 "raid_level": "raid1", 00:22:21.115 "superblock": false, 00:22:21.115 
"num_base_bdevs": 2, 00:22:21.115 "num_base_bdevs_discovered": 1, 00:22:21.115 "num_base_bdevs_operational": 1, 00:22:21.115 "base_bdevs_list": [ 00:22:21.115 { 00:22:21.115 "name": null, 00:22:21.116 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:21.116 "is_configured": false, 00:22:21.116 "data_offset": 0, 00:22:21.116 "data_size": 65536 00:22:21.116 }, 00:22:21.116 { 00:22:21.116 "name": "BaseBdev2", 00:22:21.116 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:21.116 "is_configured": true, 00:22:21.116 "data_offset": 0, 00:22:21.116 "data_size": 65536 00:22:21.116 } 00:22:21.116 ] 00:22:21.116 }' 00:22:21.116 13:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:21.116 13:31:01 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:21.116 13:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:21.116 [2024-07-25 13:31:01.799997] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:21.116 [2024-07-25 13:31:01.803436] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x965500 00:22:21.116 [2024-07-25 13:31:01.805041] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:21.116 13:31:01 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:22.055 13:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:22.055 13:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:22.055 13:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:22.055 13:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:22.055 13:31:02 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:22.055 13:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.055 13:31:02 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.316 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:22.316 "name": "raid_bdev1", 00:22:22.316 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:22.316 "strip_size_kb": 0, 00:22:22.316 "state": "online", 00:22:22.316 "raid_level": "raid1", 00:22:22.316 "superblock": false, 00:22:22.316 "num_base_bdevs": 2, 00:22:22.316 "num_base_bdevs_discovered": 2, 00:22:22.316 "num_base_bdevs_operational": 2, 00:22:22.316 "process": { 00:22:22.316 "type": "rebuild", 00:22:22.316 "target": "spare", 00:22:22.316 "progress": { 00:22:22.316 "blocks": 22528, 00:22:22.316 "percent": 34 00:22:22.316 } 00:22:22.316 }, 00:22:22.316 "base_bdevs_list": [ 00:22:22.316 { 00:22:22.316 "name": "spare", 00:22:22.316 "uuid": "01a7f85d-0b2c-5ddc-891a-ff65bda586b8", 00:22:22.316 "is_configured": true, 00:22:22.316 "data_offset": 0, 00:22:22.316 "data_size": 65536 00:22:22.316 }, 00:22:22.316 { 00:22:22.316 "name": "BaseBdev2", 00:22:22.316 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:22.316 "is_configured": true, 00:22:22.316 "data_offset": 0, 00:22:22.316 "data_size": 65536 00:22:22.316 } 00:22:22.316 ] 00:22:22.316 }' 00:22:22.316 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:22.316 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:22.316 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:22.576 
13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:22.576 [2024-07-25 13:31:03.297561] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:22.576 [2024-07-25 13:31:03.313882] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:22.576 [2024-07-25 13:31:03.313912] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:22.576 [2024-07-25 13:31:03.313923] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:22.576 [2024-07-25 13:31:03.313928] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.576 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.836 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.836 "name": "raid_bdev1", 00:22:22.836 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:22.836 "strip_size_kb": 0, 00:22:22.836 "state": "online", 00:22:22.836 "raid_level": "raid1", 00:22:22.836 "superblock": false, 00:22:22.836 "num_base_bdevs": 2, 00:22:22.836 "num_base_bdevs_discovered": 1, 00:22:22.836 "num_base_bdevs_operational": 1, 00:22:22.836 "base_bdevs_list": [ 00:22:22.836 { 00:22:22.836 "name": null, 00:22:22.836 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.836 "is_configured": false, 00:22:22.836 "data_offset": 0, 00:22:22.836 "data_size": 65536 00:22:22.836 }, 00:22:22.836 { 00:22:22.836 "name": "BaseBdev2", 00:22:22.836 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:22.836 "is_configured": true, 00:22:22.836 "data_offset": 0, 00:22:22.836 "data_size": 65536 00:22:22.836 } 00:22:22.836 ] 00:22:22.836 }' 00:22:22.836 13:31:03 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.836 13:31:03 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:23.406 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:23.406 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:23.406 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:23.406 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:23.406 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:23.406 13:31:04 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.406 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.665 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:23.665 "name": "raid_bdev1", 00:22:23.665 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:23.665 "strip_size_kb": 0, 00:22:23.665 "state": "online", 00:22:23.665 "raid_level": "raid1", 00:22:23.665 "superblock": false, 00:22:23.665 "num_base_bdevs": 2, 00:22:23.665 "num_base_bdevs_discovered": 1, 00:22:23.665 "num_base_bdevs_operational": 1, 00:22:23.665 "base_bdevs_list": [ 00:22:23.665 { 00:22:23.665 "name": null, 00:22:23.665 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.665 "is_configured": false, 00:22:23.665 "data_offset": 0, 00:22:23.665 "data_size": 65536 00:22:23.665 }, 00:22:23.665 { 00:22:23.665 "name": "BaseBdev2", 00:22:23.665 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:23.665 "is_configured": true, 00:22:23.665 "data_offset": 0, 00:22:23.665 "data_size": 65536 00:22:23.665 } 00:22:23.665 ] 00:22:23.665 }' 00:22:23.665 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:23.665 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:23.665 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:23.665 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:23.665 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:23.925 [2024-07-25 13:31:04.516749] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 
00:22:23.925 [2024-07-25 13:31:04.520150] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x964590 00:22:23.925 [2024-07-25 13:31:04.521289] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:23.925 13:31:04 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@678 -- # sleep 1 00:22:24.864 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:24.864 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:24.864 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:24.864 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:24.864 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:24.864 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:24.864 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:25.154 "name": "raid_bdev1", 00:22:25.154 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:25.154 "strip_size_kb": 0, 00:22:25.154 "state": "online", 00:22:25.154 "raid_level": "raid1", 00:22:25.154 "superblock": false, 00:22:25.154 "num_base_bdevs": 2, 00:22:25.154 "num_base_bdevs_discovered": 2, 00:22:25.154 "num_base_bdevs_operational": 2, 00:22:25.154 "process": { 00:22:25.154 "type": "rebuild", 00:22:25.154 "target": "spare", 00:22:25.154 "progress": { 00:22:25.154 "blocks": 22528, 00:22:25.154 "percent": 34 00:22:25.154 } 00:22:25.154 }, 00:22:25.154 "base_bdevs_list": [ 00:22:25.154 { 00:22:25.154 "name": "spare", 00:22:25.154 "uuid": 
"01a7f85d-0b2c-5ddc-891a-ff65bda586b8", 00:22:25.154 "is_configured": true, 00:22:25.154 "data_offset": 0, 00:22:25.154 "data_size": 65536 00:22:25.154 }, 00:22:25.154 { 00:22:25.154 "name": "BaseBdev2", 00:22:25.154 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:25.154 "is_configured": true, 00:22:25.154 "data_offset": 0, 00:22:25.154 "data_size": 65536 00:22:25.154 } 00:22:25.154 ] 00:22:25.154 }' 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # local timeout=722 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # 
local raid_bdev_info 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:25.154 13:31:05 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:25.413 13:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:25.413 "name": "raid_bdev1", 00:22:25.413 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:25.413 "strip_size_kb": 0, 00:22:25.413 "state": "online", 00:22:25.413 "raid_level": "raid1", 00:22:25.413 "superblock": false, 00:22:25.413 "num_base_bdevs": 2, 00:22:25.413 "num_base_bdevs_discovered": 2, 00:22:25.413 "num_base_bdevs_operational": 2, 00:22:25.413 "process": { 00:22:25.413 "type": "rebuild", 00:22:25.413 "target": "spare", 00:22:25.413 "progress": { 00:22:25.413 "blocks": 28672, 00:22:25.413 "percent": 43 00:22:25.413 } 00:22:25.413 }, 00:22:25.413 "base_bdevs_list": [ 00:22:25.413 { 00:22:25.413 "name": "spare", 00:22:25.413 "uuid": "01a7f85d-0b2c-5ddc-891a-ff65bda586b8", 00:22:25.413 "is_configured": true, 00:22:25.413 "data_offset": 0, 00:22:25.413 "data_size": 65536 00:22:25.413 }, 00:22:25.413 { 00:22:25.413 "name": "BaseBdev2", 00:22:25.413 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:25.413 "is_configured": true, 00:22:25.413 "data_offset": 0, 00:22:25.413 "data_size": 65536 00:22:25.413 } 00:22:25.413 ] 00:22:25.413 }' 00:22:25.413 13:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:25.413 13:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:25.414 13:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:25.414 13:31:06 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:25.414 13:31:06 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@726 -- # sleep 1 00:22:26.354 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:22:26.354 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:26.354 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:26.354 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:26.354 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:26.354 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:26.354 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:26.354 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:26.614 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:26.615 "name": "raid_bdev1", 00:22:26.615 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:26.615 "strip_size_kb": 0, 00:22:26.615 "state": "online", 00:22:26.615 "raid_level": "raid1", 00:22:26.615 "superblock": false, 00:22:26.615 "num_base_bdevs": 2, 00:22:26.615 "num_base_bdevs_discovered": 2, 00:22:26.615 "num_base_bdevs_operational": 2, 00:22:26.615 "process": { 00:22:26.615 "type": "rebuild", 00:22:26.615 "target": "spare", 00:22:26.615 "progress": { 00:22:26.615 "blocks": 55296, 00:22:26.615 "percent": 84 00:22:26.615 } 00:22:26.615 }, 00:22:26.615 "base_bdevs_list": [ 00:22:26.615 { 00:22:26.615 "name": "spare", 00:22:26.615 "uuid": "01a7f85d-0b2c-5ddc-891a-ff65bda586b8", 00:22:26.615 "is_configured": true, 00:22:26.615 "data_offset": 0, 00:22:26.615 "data_size": 65536 00:22:26.615 }, 00:22:26.615 { 00:22:26.615 "name": "BaseBdev2", 
00:22:26.615 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:26.615 "is_configured": true, 00:22:26.615 "data_offset": 0, 00:22:26.615 "data_size": 65536 00:22:26.615 } 00:22:26.615 ] 00:22:26.615 }' 00:22:26.615 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:26.615 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:26.615 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:26.615 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:26.615 13:31:07 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@726 -- # sleep 1 00:22:27.185 [2024-07-25 13:31:07.740054] bdev_raid.c:2886:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:27.185 [2024-07-25 13:31:07.740099] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:27.185 [2024-07-25 13:31:07.740125] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:27.755 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:22:27.755 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:27.755 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:27.755 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:27.755 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:27.755 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:27.755 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.755 13:31:08 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.016 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:28.016 "name": "raid_bdev1", 00:22:28.016 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:28.016 "strip_size_kb": 0, 00:22:28.016 "state": "online", 00:22:28.016 "raid_level": "raid1", 00:22:28.016 "superblock": false, 00:22:28.016 "num_base_bdevs": 2, 00:22:28.016 "num_base_bdevs_discovered": 2, 00:22:28.016 "num_base_bdevs_operational": 2, 00:22:28.016 "base_bdevs_list": [ 00:22:28.016 { 00:22:28.016 "name": "spare", 00:22:28.016 "uuid": "01a7f85d-0b2c-5ddc-891a-ff65bda586b8", 00:22:28.016 "is_configured": true, 00:22:28.016 "data_offset": 0, 00:22:28.016 "data_size": 65536 00:22:28.016 }, 00:22:28.016 { 00:22:28.016 "name": "BaseBdev2", 00:22:28.016 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:28.017 "is_configured": true, 00:22:28.017 "data_offset": 0, 00:22:28.017 "data_size": 65536 00:22:28.017 } 00:22:28.017 ] 00:22:28.017 }' 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # break 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@184 -- # local target=none 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.017 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:28.278 "name": "raid_bdev1", 00:22:28.278 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:28.278 "strip_size_kb": 0, 00:22:28.278 "state": "online", 00:22:28.278 "raid_level": "raid1", 00:22:28.278 "superblock": false, 00:22:28.278 "num_base_bdevs": 2, 00:22:28.278 "num_base_bdevs_discovered": 2, 00:22:28.278 "num_base_bdevs_operational": 2, 00:22:28.278 "base_bdevs_list": [ 00:22:28.278 { 00:22:28.278 "name": "spare", 00:22:28.278 "uuid": "01a7f85d-0b2c-5ddc-891a-ff65bda586b8", 00:22:28.278 "is_configured": true, 00:22:28.278 "data_offset": 0, 00:22:28.278 "data_size": 65536 00:22:28.278 }, 00:22:28.278 { 00:22:28.278 "name": "BaseBdev2", 00:22:28.278 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:28.278 "is_configured": true, 00:22:28.278 "data_offset": 0, 00:22:28.278 "data_size": 65536 00:22:28.278 } 00:22:28.278 ] 00:22:28.278 }' 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 
00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.278 13:31:08 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.538 13:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:28.538 "name": "raid_bdev1", 00:22:28.538 "uuid": "3517c23f-919f-4b6a-ae02-c135a2054335", 00:22:28.538 "strip_size_kb": 0, 00:22:28.538 "state": "online", 00:22:28.538 "raid_level": "raid1", 00:22:28.538 "superblock": false, 00:22:28.538 "num_base_bdevs": 2, 00:22:28.538 "num_base_bdevs_discovered": 2, 00:22:28.538 "num_base_bdevs_operational": 2, 00:22:28.538 "base_bdevs_list": [ 00:22:28.538 { 00:22:28.538 "name": "spare", 00:22:28.538 "uuid": "01a7f85d-0b2c-5ddc-891a-ff65bda586b8", 00:22:28.538 "is_configured": true, 00:22:28.538 "data_offset": 0, 00:22:28.538 "data_size": 65536 00:22:28.538 }, 00:22:28.538 { 00:22:28.538 
"name": "BaseBdev2", 00:22:28.538 "uuid": "fc000d0e-abc6-5e96-a11a-6fba045bece5", 00:22:28.538 "is_configured": true, 00:22:28.538 "data_offset": 0, 00:22:28.538 "data_size": 65536 00:22:28.538 } 00:22:28.538 ] 00:22:28.538 }' 00:22:28.538 13:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:28.538 13:31:09 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:29.109 13:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:29.109 [2024-07-25 13:31:09.861291] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:29.109 [2024-07-25 13:31:09.861311] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:29.109 [2024-07-25 13:31:09.861356] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:29.109 [2024-07-25 13:31:09.861397] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:29.109 [2024-07-25 13:31:09.861403] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x965c70 name raid_bdev1, state offline 00:22:29.109 13:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:29.109 13:31:09 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # jq length 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' 
'/dev/nbd0 /dev/nbd1' 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:29.369 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:29.628 /dev/nbd0 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:22:29.628 13:31:10 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:29.628 1+0 records in 00:22:29.628 1+0 records out 00:22:29.628 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0002939 s, 13.9 MB/s 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:29.628 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:29.887 /dev/nbd1 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@871 -- # (( i <= 20 )) 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:29.887 1+0 records in 00:22:29.887 1+0 records out 00:22:29.887 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294613 s, 13.9 MB/s 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@753 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:29.887 13:31:10 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:29.888 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:29.888 
13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:29.888 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:29.888 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:22:29.888 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:29.888 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:30.148 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:30.148 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:30.148 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:30.148 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:30.148 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:30.148 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:30.148 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:30.148 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:30.148 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:30.148 13:31:10 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:30.407 13:31:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:30.407 13:31:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:30.407 13:31:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:30.407 13:31:11 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:30.407 13:31:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:30.407 13:31:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:30.407 13:31:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:22:30.407 13:31:11 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:22:30.407 13:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@798 -- # killprocess 995309 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # '[' -z 995309 ']' 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # kill -0 995309 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # uname 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 995309 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 995309' 00:22:30.408 killing process with pid 995309 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@969 -- # kill 995309 00:22:30.408 Received shutdown signal, test time was about 60.000000 seconds 00:22:30.408 00:22:30.408 Latency(us) 00:22:30.408 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:30.408 
=================================================================================================================== 00:22:30.408 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:30.408 [2024-07-25 13:31:11.091598] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:30.408 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@974 -- # wait 995309 00:22:30.408 [2024-07-25 13:31:11.106201] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@800 -- # return 0 00:22:30.668 00:22:30.668 real 0m22.904s 00:22:30.668 user 0m29.111s 00:22:30.668 sys 0m4.327s 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:22:30.668 ************************************ 00:22:30.668 END TEST raid_rebuild_test 00:22:30.668 ************************************ 00:22:30.668 13:31:11 bdev_raid -- bdev/bdev_raid.sh@958 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:22:30.668 13:31:11 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:22:30.668 13:31:11 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:22:30.668 13:31:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:30.668 ************************************ 00:22:30.668 START TEST raid_rebuild_test_sb 00:22:30.668 ************************************ 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # local 
superblock=true 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # local verify=true 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # local strip_size 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # local create_arg 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # local data_offset 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:22:30.668 13:31:11 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # raid_pid=999282 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # waitforlisten 999282 /var/tmp/spdk-raid.sock 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # '[' -z 999282 ']' 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:30.668 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:22:30.668 13:31:11 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:30.668 [2024-07-25 13:31:11.366361] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:22:30.668 [2024-07-25 13:31:11.366406] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid999282 ] 00:22:30.668 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:30.668 Zero copy mechanism will not be used. 00:22:30.668 [2024-07-25 13:31:11.452787] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:30.928 [2024-07-25 13:31:11.516230] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:22:30.928 [2024-07-25 13:31:11.554664] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:30.928 [2024-07-25 13:31:11.554688] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:31.499 13:31:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:22:31.499 13:31:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # return 0 00:22:31.499 13:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:22:31.499 13:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:31.759 BaseBdev1_malloc 00:22:31.759 13:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:32.018 [2024-07-25 13:31:12.572790] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:32.018 [2024-07-25 13:31:12.572824] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.018 [2024-07-25 13:31:12.572837] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x26f4d10 00:22:32.019 [2024-07-25 13:31:12.572844] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.019 [2024-07-25 13:31:12.574102] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.019 [2024-07-25 13:31:12.574123] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:32.019 BaseBdev1 00:22:32.019 13:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:22:32.019 13:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:32.019 BaseBdev2_malloc 00:22:32.019 13:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:32.279 [2024-07-25 13:31:12.951739] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:32.279 [2024-07-25 13:31:12.951769] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.279 [2024-07-25 13:31:12.951782] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f56d0 00:22:32.279 [2024-07-25 13:31:12.951789] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.279 [2024-07-25 13:31:12.952975] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.279 [2024-07-25 13:31:12.952994] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:32.279 BaseBdev2 00:22:32.279 13:31:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 
00:22:32.539 spare_malloc 00:22:32.539 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:32.799 spare_delay 00:22:32.799 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:32.799 [2024-07-25 13:31:13.519278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:32.799 [2024-07-25 13:31:13.519306] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:32.799 [2024-07-25 13:31:13.519318] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26ecac0 00:22:32.799 [2024-07-25 13:31:13.519324] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:32.799 [2024-07-25 13:31:13.520529] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:32.799 [2024-07-25 13:31:13.520553] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:32.799 spare 00:22:32.799 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:33.071 [2024-07-25 13:31:13.707771] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:33.071 [2024-07-25 13:31:13.708771] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:33.071 [2024-07-25 13:31:13.708878] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x26edc70 00:22:33.071 [2024-07-25 13:31:13.708886] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 
512 00:22:33.071 [2024-07-25 13:31:13.709040] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26f5960 00:22:33.071 [2024-07-25 13:31:13.709147] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26edc70 00:22:33.071 [2024-07-25 13:31:13.709152] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26edc70 00:22:33.071 [2024-07-25 13:31:13.709232] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.071 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:33.368 13:31:13 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:33.368 "name": "raid_bdev1", 00:22:33.368 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:33.368 "strip_size_kb": 0, 00:22:33.368 "state": "online", 00:22:33.368 "raid_level": "raid1", 00:22:33.368 "superblock": true, 00:22:33.368 "num_base_bdevs": 2, 00:22:33.368 "num_base_bdevs_discovered": 2, 00:22:33.368 "num_base_bdevs_operational": 2, 00:22:33.368 "base_bdevs_list": [ 00:22:33.368 { 00:22:33.368 "name": "BaseBdev1", 00:22:33.368 "uuid": "9a0f9442-8be8-5ea7-9dd4-59b1a0bdf734", 00:22:33.368 "is_configured": true, 00:22:33.368 "data_offset": 2048, 00:22:33.368 "data_size": 63488 00:22:33.368 }, 00:22:33.368 { 00:22:33.368 "name": "BaseBdev2", 00:22:33.368 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:33.368 "is_configured": true, 00:22:33.368 "data_offset": 2048, 00:22:33.368 "data_size": 63488 00:22:33.368 } 00:22:33.368 ] 00:22:33.368 }' 00:22:33.368 13:31:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:33.368 13:31:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:33.648 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:33.648 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:22:33.908 [2024-07-25 13:31:14.610245] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:33.908 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:22:33.908 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:33.908 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:34.167 13:31:14 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:34.167 13:31:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:22:34.427 [2024-07-25 13:31:15.003065] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26f5960 00:22:34.427 /dev/nbd0 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local 
nbd_name=nbd0 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:34.427 1+0 records in 00:22:34.427 1+0 records out 00:22:34.427 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00029788 s, 13.8 MB/s 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@644 -- # 
'[' raid1 = raid5f ']' 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:22:34.427 13:31:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:22:42.552 63488+0 records in 00:22:42.552 63488+0 records out 00:22:42.552 32505856 bytes (33 MB, 31 MiB) copied, 7.3665 s, 4.4 MB/s 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:42.552 13:31:22 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:42.552 [2024-07-25 13:31:22.628889] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:42.552 [2024-07-25 13:31:22.805361] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:42.552 13:31:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:42.552 13:31:23 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:42.552 "name": "raid_bdev1", 00:22:42.552 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:42.552 "strip_size_kb": 0, 00:22:42.552 "state": "online", 00:22:42.552 "raid_level": "raid1", 00:22:42.552 "superblock": true, 00:22:42.552 "num_base_bdevs": 2, 00:22:42.552 "num_base_bdevs_discovered": 1, 00:22:42.552 "num_base_bdevs_operational": 1, 00:22:42.552 "base_bdevs_list": [ 00:22:42.552 { 00:22:42.552 "name": null, 00:22:42.552 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:42.552 "is_configured": false, 00:22:42.552 "data_offset": 2048, 00:22:42.552 "data_size": 63488 00:22:42.552 }, 00:22:42.552 { 00:22:42.552 "name": "BaseBdev2", 00:22:42.552 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:42.552 "is_configured": true, 00:22:42.552 "data_offset": 2048, 00:22:42.552 "data_size": 63488 00:22:42.552 } 00:22:42.552 ] 00:22:42.552 }' 00:22:42.552 13:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:42.552 13:31:23 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:42.812 13:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:43.071 [2024-07-25 13:31:23.727691] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:43.071 [2024-07-25 13:31:23.731153] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ed500 00:22:43.071 [2024-07-25 13:31:23.732786] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:43.071 13:31:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:44.010 13:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:44.010 13:31:24 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:44.010 13:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:44.010 13:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:44.010 13:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:44.010 13:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.010 13:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.270 13:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:44.270 "name": "raid_bdev1", 00:22:44.270 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:44.270 "strip_size_kb": 0, 00:22:44.270 "state": "online", 00:22:44.270 "raid_level": "raid1", 00:22:44.270 "superblock": true, 00:22:44.270 "num_base_bdevs": 2, 00:22:44.270 "num_base_bdevs_discovered": 2, 00:22:44.270 "num_base_bdevs_operational": 2, 00:22:44.270 "process": { 00:22:44.270 "type": "rebuild", 00:22:44.270 "target": "spare", 00:22:44.270 "progress": { 00:22:44.270 "blocks": 22528, 00:22:44.270 "percent": 35 00:22:44.270 } 00:22:44.270 }, 00:22:44.270 "base_bdevs_list": [ 00:22:44.270 { 00:22:44.270 "name": "spare", 00:22:44.270 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:44.270 "is_configured": true, 00:22:44.270 "data_offset": 2048, 00:22:44.270 "data_size": 63488 00:22:44.270 }, 00:22:44.270 { 00:22:44.270 "name": "BaseBdev2", 00:22:44.270 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:44.270 "is_configured": true, 00:22:44.270 "data_offset": 2048, 00:22:44.270 "data_size": 63488 00:22:44.270 } 00:22:44.270 ] 00:22:44.270 }' 00:22:44.270 13:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:22:44.270 13:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:44.270 13:31:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:44.270 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:44.270 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:44.530 [2024-07-25 13:31:25.213583] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:44.530 [2024-07-25 13:31:25.241638] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:44.530 [2024-07-25 13:31:25.241666] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:44.530 [2024-07-25 13:31:25.241676] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:44.530 [2024-07-25 13:31:25.241680] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:44.530 13:31:25 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:44.530 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:44.789 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:44.789 "name": "raid_bdev1", 00:22:44.789 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:44.789 "strip_size_kb": 0, 00:22:44.789 "state": "online", 00:22:44.789 "raid_level": "raid1", 00:22:44.789 "superblock": true, 00:22:44.789 "num_base_bdevs": 2, 00:22:44.789 "num_base_bdevs_discovered": 1, 00:22:44.789 "num_base_bdevs_operational": 1, 00:22:44.789 "base_bdevs_list": [ 00:22:44.789 { 00:22:44.789 "name": null, 00:22:44.789 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:44.790 "is_configured": false, 00:22:44.790 "data_offset": 2048, 00:22:44.790 "data_size": 63488 00:22:44.790 }, 00:22:44.790 { 00:22:44.790 "name": "BaseBdev2", 00:22:44.790 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:44.790 "is_configured": true, 00:22:44.790 "data_offset": 2048, 00:22:44.790 "data_size": 63488 00:22:44.790 } 00:22:44.790 ] 00:22:44.790 }' 00:22:44.790 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:44.790 13:31:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:45.358 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:45.358 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:22:45.358 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:45.358 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:45.358 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:45.358 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.358 13:31:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:45.618 13:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:45.618 "name": "raid_bdev1", 00:22:45.618 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:45.618 "strip_size_kb": 0, 00:22:45.618 "state": "online", 00:22:45.618 "raid_level": "raid1", 00:22:45.618 "superblock": true, 00:22:45.618 "num_base_bdevs": 2, 00:22:45.618 "num_base_bdevs_discovered": 1, 00:22:45.618 "num_base_bdevs_operational": 1, 00:22:45.618 "base_bdevs_list": [ 00:22:45.618 { 00:22:45.618 "name": null, 00:22:45.618 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:45.618 "is_configured": false, 00:22:45.618 "data_offset": 2048, 00:22:45.618 "data_size": 63488 00:22:45.618 }, 00:22:45.618 { 00:22:45.618 "name": "BaseBdev2", 00:22:45.618 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:45.618 "is_configured": true, 00:22:45.618 "data_offset": 2048, 00:22:45.618 "data_size": 63488 00:22:45.618 } 00:22:45.618 ] 00:22:45.618 }' 00:22:45.618 13:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:45.618 13:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:45.618 13:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:45.618 13:31:26 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:45.618 13:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:45.877 [2024-07-25 13:31:26.440690] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:45.877 [2024-07-25 13:31:26.444007] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ec590 00:22:45.877 [2024-07-25 13:31:26.445143] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:45.877 13:31:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@678 -- # sleep 1 00:22:46.889 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:46.889 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:46.889 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:46.889 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:46.889 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:46.889 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.889 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:46.889 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:46.889 "name": "raid_bdev1", 00:22:46.889 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:46.889 "strip_size_kb": 0, 00:22:46.889 "state": "online", 00:22:46.889 "raid_level": "raid1", 00:22:46.889 "superblock": true, 
00:22:46.889 "num_base_bdevs": 2, 00:22:46.889 "num_base_bdevs_discovered": 2, 00:22:46.889 "num_base_bdevs_operational": 2, 00:22:46.889 "process": { 00:22:46.889 "type": "rebuild", 00:22:46.889 "target": "spare", 00:22:46.889 "progress": { 00:22:46.889 "blocks": 22528, 00:22:46.889 "percent": 35 00:22:46.889 } 00:22:46.889 }, 00:22:46.889 "base_bdevs_list": [ 00:22:46.889 { 00:22:46.889 "name": "spare", 00:22:46.889 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:46.889 "is_configured": true, 00:22:46.889 "data_offset": 2048, 00:22:46.889 "data_size": 63488 00:22:46.889 }, 00:22:46.889 { 00:22:46.889 "name": "BaseBdev2", 00:22:46.889 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:46.889 "is_configured": true, 00:22:46.889 "data_offset": 2048, 00:22:46.889 "data_size": 63488 00:22:46.889 } 00:22:46.889 ] 00:22:46.889 }' 00:22:46.889 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:22:47.149 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:22:47.149 13:31:27 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # local timeout=744 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.149 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.408 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:47.408 "name": "raid_bdev1", 00:22:47.408 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:47.408 "strip_size_kb": 0, 00:22:47.408 "state": "online", 00:22:47.408 "raid_level": "raid1", 00:22:47.408 "superblock": true, 00:22:47.408 "num_base_bdevs": 2, 00:22:47.408 "num_base_bdevs_discovered": 2, 00:22:47.408 "num_base_bdevs_operational": 2, 00:22:47.409 "process": { 00:22:47.409 "type": "rebuild", 00:22:47.409 "target": "spare", 00:22:47.409 "progress": { 00:22:47.409 "blocks": 28672, 00:22:47.409 "percent": 45 00:22:47.409 } 00:22:47.409 }, 00:22:47.409 "base_bdevs_list": [ 00:22:47.409 { 00:22:47.409 "name": "spare", 00:22:47.409 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:47.409 "is_configured": true, 00:22:47.409 "data_offset": 2048, 00:22:47.409 "data_size": 63488 00:22:47.409 }, 
00:22:47.409 { 00:22:47.409 "name": "BaseBdev2", 00:22:47.409 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:47.409 "is_configured": true, 00:22:47.409 "data_offset": 2048, 00:22:47.409 "data_size": 63488 00:22:47.409 } 00:22:47.409 ] 00:22:47.409 }' 00:22:47.409 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:47.409 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:47.409 13:31:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:47.409 13:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:47.409 13:31:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:22:48.347 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:22:48.347 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:48.347 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:48.347 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:48.347 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:48.347 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:48.347 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:48.347 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:48.606 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:48.606 "name": "raid_bdev1", 00:22:48.606 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 
00:22:48.606 "strip_size_kb": 0, 00:22:48.606 "state": "online", 00:22:48.606 "raid_level": "raid1", 00:22:48.606 "superblock": true, 00:22:48.606 "num_base_bdevs": 2, 00:22:48.606 "num_base_bdevs_discovered": 2, 00:22:48.606 "num_base_bdevs_operational": 2, 00:22:48.606 "process": { 00:22:48.606 "type": "rebuild", 00:22:48.606 "target": "spare", 00:22:48.606 "progress": { 00:22:48.606 "blocks": 55296, 00:22:48.606 "percent": 87 00:22:48.606 } 00:22:48.606 }, 00:22:48.606 "base_bdevs_list": [ 00:22:48.606 { 00:22:48.606 "name": "spare", 00:22:48.606 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:48.606 "is_configured": true, 00:22:48.606 "data_offset": 2048, 00:22:48.606 "data_size": 63488 00:22:48.606 }, 00:22:48.606 { 00:22:48.606 "name": "BaseBdev2", 00:22:48.606 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:48.606 "is_configured": true, 00:22:48.606 "data_offset": 2048, 00:22:48.606 "data_size": 63488 00:22:48.606 } 00:22:48.606 ] 00:22:48.606 }' 00:22:48.606 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:48.606 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:48.606 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:48.606 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:48.606 13:31:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:22:48.865 [2024-07-25 13:31:29.563257] bdev_raid.c:2886:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:48.865 [2024-07-25 13:31:29.563299] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:48.865 [2024-07-25 13:31:29.563362] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( 
SECONDS < timeout )) 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:49.804 "name": "raid_bdev1", 00:22:49.804 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:49.804 "strip_size_kb": 0, 00:22:49.804 "state": "online", 00:22:49.804 "raid_level": "raid1", 00:22:49.804 "superblock": true, 00:22:49.804 "num_base_bdevs": 2, 00:22:49.804 "num_base_bdevs_discovered": 2, 00:22:49.804 "num_base_bdevs_operational": 2, 00:22:49.804 "base_bdevs_list": [ 00:22:49.804 { 00:22:49.804 "name": "spare", 00:22:49.804 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:49.804 "is_configured": true, 00:22:49.804 "data_offset": 2048, 00:22:49.804 "data_size": 63488 00:22:49.804 }, 00:22:49.804 { 00:22:49.804 "name": "BaseBdev2", 00:22:49.804 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:49.804 "is_configured": true, 00:22:49.804 "data_offset": 2048, 00:22:49.804 "data_size": 63488 00:22:49.804 } 00:22:49.804 ] 00:22:49.804 }' 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
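The entries above trace the rebuild-progress poll in `bdev_raid.sh@721-726`: fetch the raid bdev via `bdev_raid_get_bdevs`, select it by name with jq, extract `.process.type` and `.process.target` (falling back to `"none"`), compare with a bash pattern match, and sleep one second while `SECONDS` stays under the timeout. The following is a minimal, self-contained sketch of that pattern only; `fake_rpc` is an illustrative stand-in for the live `rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all` call and is not part of SPDK:

```shell
#!/usr/bin/env bash
# Sketch of the verify_raid_bdev_process polling pattern seen in this log.
# fake_rpc replaces the real rpc.py invocation so the sketch runs standalone.
fake_rpc() {
  cat <<'EOF'
[{"name": "raid_bdev1", "process": {"type": "rebuild", "target": "spare"}}]
EOF
}

verify_raid_bdev_process() {
  local raid_bdev_name=$1 process_type=$2 target=$3 raid_bdev_info
  # Select the bdev of interest from the RPC array, as in bdev_raid.sh@187.
  raid_bdev_info=$(fake_rpc | jq -r ".[] | select(.name == \"$raid_bdev_name\")")
  # The `// "none"` fallback makes a finished rebuild (no .process key) compare
  # as "none" instead of "null" -- the same trick as bdev_raid.sh@189-190.
  [[ $(jq -r '.process.type // "none"' <<<"$raid_bdev_info") == "$process_type" ]] || return 1
  [[ $(jq -r '.process.target // "none"' <<<"$raid_bdev_info") == "$target" ]] || return 1
}

timeout=5
while ((SECONDS < timeout)); do
  if verify_raid_bdev_process raid_bdev1 rebuild spare; then
    echo "rebuild in progress"
    break
  fi
  sleep 1
done
```

In the real test the loop exits via `break` at `bdev_raid.sh@724` once both checks report `none`, i.e. once the rebuild process has disappeared from the RPC output.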
00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:49.804 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # break 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:50.064 "name": "raid_bdev1", 00:22:50.064 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:50.064 "strip_size_kb": 0, 00:22:50.064 "state": "online", 00:22:50.064 "raid_level": "raid1", 00:22:50.064 "superblock": true, 00:22:50.064 "num_base_bdevs": 2, 00:22:50.064 "num_base_bdevs_discovered": 2, 00:22:50.064 "num_base_bdevs_operational": 2, 00:22:50.064 "base_bdevs_list": [ 00:22:50.064 { 00:22:50.064 "name": "spare", 00:22:50.064 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:50.064 "is_configured": true, 00:22:50.064 "data_offset": 2048, 00:22:50.064 "data_size": 
63488 00:22:50.064 }, 00:22:50.064 { 00:22:50.064 "name": "BaseBdev2", 00:22:50.064 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:50.064 "is_configured": true, 00:22:50.064 "data_offset": 2048, 00:22:50.064 "data_size": 63488 00:22:50.064 } 00:22:50.064 ] 00:22:50.064 }' 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:50.064 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:50.324 13:31:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:50.324 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:50.324 "name": "raid_bdev1", 00:22:50.324 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:50.324 "strip_size_kb": 0, 00:22:50.324 "state": "online", 00:22:50.324 "raid_level": "raid1", 00:22:50.324 "superblock": true, 00:22:50.324 "num_base_bdevs": 2, 00:22:50.324 "num_base_bdevs_discovered": 2, 00:22:50.324 "num_base_bdevs_operational": 2, 00:22:50.324 "base_bdevs_list": [ 00:22:50.324 { 00:22:50.324 "name": "spare", 00:22:50.324 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:50.324 "is_configured": true, 00:22:50.324 "data_offset": 2048, 00:22:50.324 "data_size": 63488 00:22:50.324 }, 00:22:50.324 { 00:22:50.324 "name": "BaseBdev2", 00:22:50.324 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:50.324 "is_configured": true, 00:22:50.324 "data_offset": 2048, 00:22:50.324 "data_size": 63488 00:22:50.324 } 00:22:50.324 ] 00:22:50.324 }' 00:22:50.324 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:50.324 13:31:31 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:50.894 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:51.155 [2024-07-25 13:31:31.764711] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:51.155 [2024-07-25 13:31:31.764731] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:51.155 [2024-07-25 13:31:31.764775] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:51.155 [2024-07-25 13:31:31.764814] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base 
bdevs is 0, going to free all in destruct 00:22:51.155 [2024-07-25 13:31:31.764820] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26edc70 name raid_bdev1, state offline 00:22:51.155 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.155 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # jq length 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:51.416 13:31:31 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:51.416 
/dev/nbd0 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:51.416 1+0 records in 00:22:51.416 1+0 records out 00:22:51.416 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240241 s, 17.0 MB/s 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.416 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:22:51.676 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.676 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:22:51.676 13:31:32 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@889 -- # return 0 00:22:51.676 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:51.676 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:51.676 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:51.676 /dev/nbd1 00:22:51.676 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:51.677 1+0 records in 00:22:51.677 1+0 records out 00:22:51.677 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000289856 s, 14.1 MB/s 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:51.677 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:51.937 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:51.937 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:51.937 13:31:32 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:51.937 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:51.937 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:51.937 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:51.937 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:51.937 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:51.937 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:51.937 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:52.198 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:52.198 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:52.198 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:52.198 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:52.198 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:52.198 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:52.198 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:52.198 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:52.198 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:22:52.198 13:31:32 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:52.458 [2024-07-25 13:31:33.206204] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:52.458 [2024-07-25 13:31:33.206236] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:52.458 [2024-07-25 13:31:33.206249] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26edbb0 00:22:52.458 [2024-07-25 13:31:33.206255] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:52.458 [2024-07-25 13:31:33.207562] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:52.458 [2024-07-25 13:31:33.207585] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:52.458 [2024-07-25 13:31:33.207642] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:52.458 [2024-07-25 13:31:33.207663] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:52.458 [2024-07-25 13:31:33.207742] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:52.458 spare 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:52.458 13:31:33 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.458 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.718 [2024-07-25 13:31:33.308030] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x26f3b60 00:22:52.718 [2024-07-25 13:31:33.308040] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:52.718 [2024-07-25 13:31:33.308182] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26f0de0 00:22:52.718 [2024-07-25 13:31:33.308296] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26f3b60 00:22:52.718 [2024-07-25 13:31:33.308302] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26f3b60 00:22:52.718 [2024-07-25 13:31:33.308378] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:52.718 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:52.718 "name": "raid_bdev1", 00:22:52.718 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:52.718 "strip_size_kb": 0, 00:22:52.718 "state": "online", 00:22:52.718 "raid_level": "raid1", 00:22:52.718 "superblock": true, 00:22:52.718 "num_base_bdevs": 2, 00:22:52.718 "num_base_bdevs_discovered": 2, 00:22:52.718 "num_base_bdevs_operational": 2, 00:22:52.718 "base_bdevs_list": [ 00:22:52.718 { 00:22:52.718 "name": 
"spare", 00:22:52.718 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:52.718 "is_configured": true, 00:22:52.718 "data_offset": 2048, 00:22:52.718 "data_size": 63488 00:22:52.718 }, 00:22:52.718 { 00:22:52.718 "name": "BaseBdev2", 00:22:52.718 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:52.718 "is_configured": true, 00:22:52.718 "data_offset": 2048, 00:22:52.718 "data_size": 63488 00:22:52.718 } 00:22:52.718 ] 00:22:52.718 }' 00:22:52.718 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:52.718 13:31:33 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:53.286 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:53.286 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:53.286 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:53.286 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:53.286 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:53.286 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.286 13:31:33 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:53.546 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:53.546 "name": "raid_bdev1", 00:22:53.546 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:53.546 "strip_size_kb": 0, 00:22:53.546 "state": "online", 00:22:53.546 "raid_level": "raid1", 00:22:53.546 "superblock": true, 00:22:53.546 "num_base_bdevs": 2, 00:22:53.546 "num_base_bdevs_discovered": 2, 00:22:53.546 "num_base_bdevs_operational": 2, 00:22:53.546 
"base_bdevs_list": [ 00:22:53.546 { 00:22:53.546 "name": "spare", 00:22:53.546 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:53.546 "is_configured": true, 00:22:53.546 "data_offset": 2048, 00:22:53.546 "data_size": 63488 00:22:53.546 }, 00:22:53.546 { 00:22:53.546 "name": "BaseBdev2", 00:22:53.546 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:53.546 "is_configured": true, 00:22:53.546 "data_offset": 2048, 00:22:53.546 "data_size": 63488 00:22:53.546 } 00:22:53.546 ] 00:22:53.546 }' 00:22:53.546 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:53.546 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:53.546 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:53.546 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:53.546 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.546 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:53.807 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:22:53.807 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:54.068 [2024-07-25 13:31:34.641919] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:54.068 "name": "raid_bdev1", 00:22:54.068 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:54.068 "strip_size_kb": 0, 00:22:54.068 "state": "online", 00:22:54.068 "raid_level": "raid1", 00:22:54.068 "superblock": true, 00:22:54.068 "num_base_bdevs": 2, 00:22:54.068 "num_base_bdevs_discovered": 1, 00:22:54.068 "num_base_bdevs_operational": 1, 00:22:54.068 "base_bdevs_list": [ 00:22:54.068 { 00:22:54.068 "name": null, 00:22:54.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:54.068 "is_configured": false, 00:22:54.068 "data_offset": 2048, 00:22:54.068 "data_size": 63488 00:22:54.068 }, 00:22:54.068 { 00:22:54.068 "name": "BaseBdev2", 00:22:54.068 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:54.068 "is_configured": true, 00:22:54.068 
"data_offset": 2048, 00:22:54.068 "data_size": 63488 00:22:54.068 } 00:22:54.068 ] 00:22:54.068 }' 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:54.068 13:31:34 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:54.638 13:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:54.898 [2024-07-25 13:31:35.556245] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:54.898 [2024-07-25 13:31:35.556351] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:54.898 [2024-07-25 13:31:35.556360] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:54.898 [2024-07-25 13:31:35.556378] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:54.898 [2024-07-25 13:31:35.559662] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ee490 00:22:54.898 [2024-07-25 13:31:35.560721] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:54.898 13:31:35 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # sleep 1 00:22:55.838 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:55.838 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:55.838 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:55.838 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:55.838 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:55.838 
13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.838 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.097 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:56.097 "name": "raid_bdev1", 00:22:56.098 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:56.098 "strip_size_kb": 0, 00:22:56.098 "state": "online", 00:22:56.098 "raid_level": "raid1", 00:22:56.098 "superblock": true, 00:22:56.098 "num_base_bdevs": 2, 00:22:56.098 "num_base_bdevs_discovered": 2, 00:22:56.098 "num_base_bdevs_operational": 2, 00:22:56.098 "process": { 00:22:56.098 "type": "rebuild", 00:22:56.098 "target": "spare", 00:22:56.098 "progress": { 00:22:56.098 "blocks": 22528, 00:22:56.098 "percent": 35 00:22:56.098 } 00:22:56.098 }, 00:22:56.098 "base_bdevs_list": [ 00:22:56.098 { 00:22:56.098 "name": "spare", 00:22:56.098 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:56.098 "is_configured": true, 00:22:56.098 "data_offset": 2048, 00:22:56.098 "data_size": 63488 00:22:56.098 }, 00:22:56.098 { 00:22:56.098 "name": "BaseBdev2", 00:22:56.098 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:56.098 "is_configured": true, 00:22:56.098 "data_offset": 2048, 00:22:56.098 "data_size": 63488 00:22:56.098 } 00:22:56.098 ] 00:22:56.098 }' 00:22:56.098 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:56.098 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:56.098 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:56.098 13:31:36 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:56.098 13:31:36 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:56.357 [2024-07-25 13:31:37.053439] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:56.357 [2024-07-25 13:31:37.069445] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:56.357 [2024-07-25 13:31:37.069473] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:56.357 [2024-07-25 13:31:37.069482] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:56.357 [2024-07-25 13:31:37.069487] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:56.357 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.617 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:56.617 "name": "raid_bdev1", 00:22:56.617 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:56.617 "strip_size_kb": 0, 00:22:56.617 "state": "online", 00:22:56.617 "raid_level": "raid1", 00:22:56.617 "superblock": true, 00:22:56.617 "num_base_bdevs": 2, 00:22:56.617 "num_base_bdevs_discovered": 1, 00:22:56.617 "num_base_bdevs_operational": 1, 00:22:56.617 "base_bdevs_list": [ 00:22:56.617 { 00:22:56.617 "name": null, 00:22:56.617 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:56.617 "is_configured": false, 00:22:56.617 "data_offset": 2048, 00:22:56.617 "data_size": 63488 00:22:56.617 }, 00:22:56.617 { 00:22:56.617 "name": "BaseBdev2", 00:22:56.617 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:56.617 "is_configured": true, 00:22:56.617 "data_offset": 2048, 00:22:56.617 "data_size": 63488 00:22:56.617 } 00:22:56.617 ] 00:22:56.617 }' 00:22:56.617 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:56.617 13:31:37 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:57.187 13:31:37 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:57.447 [2024-07-25 13:31:38.011850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:57.447 [2024-07-25 13:31:38.011882] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:57.447 [2024-07-25 13:31:38.011895] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f10c0 
00:22:57.447 [2024-07-25 13:31:38.011902] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:57.447 [2024-07-25 13:31:38.012197] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:57.447 [2024-07-25 13:31:38.012208] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:57.447 [2024-07-25 13:31:38.012265] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:57.447 [2024-07-25 13:31:38.012272] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:57.447 [2024-07-25 13:31:38.012277] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:57.447 [2024-07-25 13:31:38.012288] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:57.447 [2024-07-25 13:31:38.015258] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26ecd50 00:22:57.447 [2024-07-25 13:31:38.016313] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:57.447 spare 00:22:57.447 13:31:38 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # sleep 1 00:22:58.386 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:58.386 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:58.386 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:58.386 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:58.386 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:58.386 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.386 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.645 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:58.645 "name": "raid_bdev1", 00:22:58.645 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:58.645 "strip_size_kb": 0, 00:22:58.645 "state": "online", 00:22:58.645 "raid_level": "raid1", 00:22:58.645 "superblock": true, 00:22:58.645 "num_base_bdevs": 2, 00:22:58.645 "num_base_bdevs_discovered": 2, 00:22:58.645 "num_base_bdevs_operational": 2, 00:22:58.645 "process": { 00:22:58.645 "type": "rebuild", 00:22:58.645 "target": "spare", 00:22:58.645 "progress": { 00:22:58.645 "blocks": 22528, 00:22:58.645 "percent": 35 00:22:58.645 } 00:22:58.645 }, 00:22:58.645 "base_bdevs_list": [ 00:22:58.645 { 00:22:58.645 "name": "spare", 00:22:58.645 "uuid": "ecb82913-be97-5da1-9688-32e9f3a0cc3f", 00:22:58.645 "is_configured": true, 00:22:58.645 "data_offset": 2048, 00:22:58.645 "data_size": 63488 00:22:58.645 }, 00:22:58.645 { 00:22:58.645 "name": "BaseBdev2", 00:22:58.645 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:58.645 "is_configured": true, 00:22:58.645 "data_offset": 2048, 00:22:58.645 "data_size": 63488 00:22:58.645 } 00:22:58.645 ] 00:22:58.645 }' 00:22:58.645 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:58.645 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:58.645 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:58.645 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:58.645 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:58.906 [2024-07-25 13:31:39.476861] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:58.906 [2024-07-25 13:31:39.524951] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:58.906 [2024-07-25 13:31:39.524985] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:58.906 [2024-07-25 13:31:39.524994] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:58.906 [2024-07-25 13:31:39.524998] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:58.906 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.166 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:59.166 "name": "raid_bdev1", 00:22:59.166 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:59.166 "strip_size_kb": 0, 00:22:59.166 "state": "online", 00:22:59.166 "raid_level": "raid1", 00:22:59.166 "superblock": true, 00:22:59.166 "num_base_bdevs": 2, 00:22:59.166 "num_base_bdevs_discovered": 1, 00:22:59.166 "num_base_bdevs_operational": 1, 00:22:59.166 "base_bdevs_list": [ 00:22:59.166 { 00:22:59.166 "name": null, 00:22:59.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.166 "is_configured": false, 00:22:59.166 "data_offset": 2048, 00:22:59.166 "data_size": 63488 00:22:59.166 }, 00:22:59.166 { 00:22:59.166 "name": "BaseBdev2", 00:22:59.166 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:59.166 "is_configured": true, 00:22:59.166 "data_offset": 2048, 00:22:59.166 "data_size": 63488 00:22:59.166 } 00:22:59.166 ] 00:22:59.166 }' 00:22:59.166 13:31:39 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:59.166 13:31:39 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:59.737 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:59.737 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:59.737 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:59.737 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:59.737 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:59.737 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.737 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:59.737 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:59.737 "name": "raid_bdev1", 00:22:59.737 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:22:59.737 "strip_size_kb": 0, 00:22:59.737 "state": "online", 00:22:59.737 "raid_level": "raid1", 00:22:59.737 "superblock": true, 00:22:59.737 "num_base_bdevs": 2, 00:22:59.737 "num_base_bdevs_discovered": 1, 00:22:59.737 "num_base_bdevs_operational": 1, 00:22:59.737 "base_bdevs_list": [ 00:22:59.737 { 00:22:59.737 "name": null, 00:22:59.737 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:59.737 "is_configured": false, 00:22:59.737 "data_offset": 2048, 00:22:59.737 "data_size": 63488 00:22:59.737 }, 00:22:59.737 { 00:22:59.737 "name": "BaseBdev2", 00:22:59.737 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:22:59.737 "is_configured": true, 00:22:59.737 "data_offset": 2048, 00:22:59.737 "data_size": 63488 00:22:59.737 } 00:22:59.737 ] 00:22:59.737 }' 00:22:59.737 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:59.996 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:59.996 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:59.996 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:59.996 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:59.996 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:00.256 [2024-07-25 13:31:40.940560] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:00.256 [2024-07-25 13:31:40.940592] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:00.256 [2024-07-25 13:31:40.940605] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26f4f40 00:23:00.256 [2024-07-25 13:31:40.940611] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:00.256 [2024-07-25 13:31:40.940888] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:00.256 [2024-07-25 13:31:40.940901] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:00.256 [2024-07-25 13:31:40.940945] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:00.256 [2024-07-25 13:31:40.940953] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:00.256 [2024-07-25 13:31:40.940959] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:00.256 BaseBdev1 00:23:00.256 13:31:40 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # sleep 1 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.193 13:31:41 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.451 13:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:01.451 "name": "raid_bdev1", 00:23:01.451 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:23:01.451 "strip_size_kb": 0, 00:23:01.451 "state": "online", 00:23:01.451 "raid_level": "raid1", 00:23:01.451 "superblock": true, 00:23:01.451 "num_base_bdevs": 2, 00:23:01.451 "num_base_bdevs_discovered": 1, 00:23:01.451 "num_base_bdevs_operational": 1, 00:23:01.451 "base_bdevs_list": [ 00:23:01.451 { 00:23:01.451 "name": null, 00:23:01.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:01.451 "is_configured": false, 00:23:01.451 "data_offset": 2048, 00:23:01.451 "data_size": 63488 00:23:01.451 }, 00:23:01.451 { 00:23:01.451 "name": "BaseBdev2", 00:23:01.451 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:23:01.451 "is_configured": true, 00:23:01.451 "data_offset": 2048, 00:23:01.451 "data_size": 63488 00:23:01.451 } 00:23:01.451 ] 00:23:01.451 }' 00:23:01.451 13:31:42 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:01.451 13:31:42 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@10 -- # set +x 00:23:02.388 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:02.388 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:02.388 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:02.388 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:02.388 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:02.388 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.388 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:02.959 "name": "raid_bdev1", 00:23:02.959 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:23:02.959 "strip_size_kb": 0, 00:23:02.959 "state": "online", 00:23:02.959 "raid_level": "raid1", 00:23:02.959 "superblock": true, 00:23:02.959 "num_base_bdevs": 2, 00:23:02.959 "num_base_bdevs_discovered": 1, 00:23:02.959 "num_base_bdevs_operational": 1, 00:23:02.959 "base_bdevs_list": [ 00:23:02.959 { 00:23:02.959 "name": null, 00:23:02.959 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.959 "is_configured": false, 00:23:02.959 "data_offset": 2048, 00:23:02.959 "data_size": 63488 00:23:02.959 }, 00:23:02.959 { 00:23:02.959 "name": "BaseBdev2", 00:23:02.959 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:23:02.959 "is_configured": true, 00:23:02.959 "data_offset": 2048, 00:23:02.959 "data_size": 63488 00:23:02.959 } 00:23:02.959 ] 00:23:02.959 }' 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type 
// "none"' 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # local es=0 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # 
[[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:02.959 13:31:43 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:03.527 [2024-07-25 13:31:44.224962] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:03.527 [2024-07-25 13:31:44.225061] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:03.527 [2024-07-25 13:31:44.225070] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:03.527 request: 00:23:03.527 { 00:23:03.527 "base_bdev": "BaseBdev1", 00:23:03.527 "raid_bdev": "raid_bdev1", 00:23:03.527 "method": "bdev_raid_add_base_bdev", 00:23:03.527 "req_id": 1 00:23:03.527 } 00:23:03.527 Got JSON-RPC error response 00:23:03.527 response: 00:23:03.527 { 00:23:03.527 "code": -22, 00:23:03.527 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:03.527 } 00:23:03.527 13:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # es=1 00:23:03.527 13:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:23:03.527 13:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:23:03.527 13:31:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:23:03.527 13:31:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@793 -- # sleep 1 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:04.907 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.166 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.166 "name": "raid_bdev1", 00:23:05.166 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:23:05.166 "strip_size_kb": 0, 00:23:05.166 "state": "online", 00:23:05.166 "raid_level": "raid1", 00:23:05.166 "superblock": true, 00:23:05.166 "num_base_bdevs": 2, 00:23:05.166 "num_base_bdevs_discovered": 1, 00:23:05.166 "num_base_bdevs_operational": 1, 00:23:05.166 "base_bdevs_list": [ 00:23:05.166 { 00:23:05.166 "name": null, 00:23:05.166 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.166 "is_configured": false, 00:23:05.166 "data_offset": 2048, 00:23:05.166 "data_size": 63488 00:23:05.166 }, 00:23:05.166 { 00:23:05.166 "name": "BaseBdev2", 00:23:05.166 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:23:05.166 "is_configured": true, 00:23:05.166 "data_offset": 2048, 00:23:05.166 
"data_size": 63488 00:23:05.166 } 00:23:05.166 ] 00:23:05.166 }' 00:23:05.166 13:31:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.166 13:31:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:05.735 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:05.735 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:05.735 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:05.735 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:05.735 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:05.735 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.735 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.994 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:05.994 "name": "raid_bdev1", 00:23:05.994 "uuid": "88f7593a-faef-4ce7-a2d6-f23392091098", 00:23:05.994 "strip_size_kb": 0, 00:23:05.994 "state": "online", 00:23:05.994 "raid_level": "raid1", 00:23:05.994 "superblock": true, 00:23:05.994 "num_base_bdevs": 2, 00:23:05.994 "num_base_bdevs_discovered": 1, 00:23:05.994 "num_base_bdevs_operational": 1, 00:23:05.994 "base_bdevs_list": [ 00:23:05.994 { 00:23:05.994 "name": null, 00:23:05.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.994 "is_configured": false, 00:23:05.994 "data_offset": 2048, 00:23:05.994 "data_size": 63488 00:23:05.994 }, 00:23:05.994 { 00:23:05.994 "name": "BaseBdev2", 00:23:05.994 "uuid": "7a6f156e-3466-5051-9368-d25b24c84a35", 00:23:05.994 "is_configured": true, 
00:23:05.994 "data_offset": 2048, 00:23:05.994 "data_size": 63488 00:23:05.994 } 00:23:05.994 ] 00:23:05.994 }' 00:23:05.994 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:05.994 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:05.994 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:05.994 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@798 -- # killprocess 999282 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # '[' -z 999282 ']' 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # kill -0 999282 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # uname 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 999282 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 999282' 00:23:05.995 killing process with pid 999282 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@969 -- # kill 999282 00:23:05.995 Received shutdown signal, test time was about 60.000000 seconds 00:23:05.995 00:23:05.995 Latency(us) 00:23:05.995 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:05.995 
=================================================================================================================== 00:23:05.995 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:05.995 [2024-07-25 13:31:46.749847] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:05.995 [2024-07-25 13:31:46.749911] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:05.995 [2024-07-25 13:31:46.749941] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:05.995 [2024-07-25 13:31:46.749947] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26f3b60 name raid_bdev1, state offline 00:23:05.995 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@974 -- # wait 999282 00:23:05.995 [2024-07-25 13:31:46.764892] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@800 -- # return 0 00:23:06.256 00:23:06.256 real 0m35.578s 00:23:06.256 user 0m51.038s 00:23:06.256 sys 0m5.514s 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:06.256 ************************************ 00:23:06.256 END TEST raid_rebuild_test_sb 00:23:06.256 ************************************ 00:23:06.256 13:31:46 bdev_raid -- bdev/bdev_raid.sh@959 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:23:06.256 13:31:46 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:23:06.256 13:31:46 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:06.256 13:31:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:06.256 ************************************ 00:23:06.256 START TEST raid_rebuild_test_io 00:23:06.256 ************************************ 00:23:06.256 
13:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 false true true 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:23:06.256 
13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1005569 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1005569 /var/tmp/spdk-raid.sock 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # '[' -z 1005569 ']' 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:06.256 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:06.256 13:31:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:06.256 [2024-07-25 13:31:47.026452] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:23:06.256 [2024-07-25 13:31:47.026499] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1005569 ] 00:23:06.257 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:06.257 Zero copy mechanism will not be used. 00:23:06.516 [2024-07-25 13:31:47.115746] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:06.516 [2024-07-25 13:31:47.180382] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:06.516 [2024-07-25 13:31:47.219555] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:06.516 [2024-07-25 13:31:47.219582] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:07.086 13:31:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:07.086 13:31:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # return 0 00:23:07.086 13:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:07.086 13:31:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:07.346 BaseBdev1_malloc 00:23:07.346 13:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:07.606 [2024-07-25 13:31:48.229987] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:07.606 [2024-07-25 13:31:48.230022] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:07.606 [2024-07-25 13:31:48.230035] vbdev_passthru.c: 681:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0xf01d10 00:23:07.606 [2024-07-25 13:31:48.230041] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:07.606 [2024-07-25 13:31:48.231329] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:07.606 [2024-07-25 13:31:48.231349] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:07.606 BaseBdev1 00:23:07.606 13:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:07.606 13:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:07.865 BaseBdev2_malloc 00:23:07.865 13:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:07.865 [2024-07-25 13:31:48.612964] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:07.865 [2024-07-25 13:31:48.612991] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:07.865 [2024-07-25 13:31:48.613005] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf026d0 00:23:07.865 [2024-07-25 13:31:48.613011] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:07.865 [2024-07-25 13:31:48.614210] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:07.865 [2024-07-25 13:31:48.614230] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:07.865 BaseBdev2 00:23:07.865 13:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:08.125 
spare_malloc 00:23:08.125 13:31:48 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:08.385 spare_delay 00:23:08.385 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:08.646 [2024-07-25 13:31:49.180285] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:08.646 [2024-07-25 13:31:49.180314] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:08.646 [2024-07-25 13:31:49.180327] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xef9ac0 00:23:08.646 [2024-07-25 13:31:49.180333] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:08.646 [2024-07-25 13:31:49.181534] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:08.646 [2024-07-25 13:31:49.181560] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:08.646 spare 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:08.646 [2024-07-25 13:31:49.372783] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:08.646 [2024-07-25 13:31:49.373786] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:08.646 [2024-07-25 13:31:49.373844] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xefac70 00:23:08.646 [2024-07-25 13:31:49.373850] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:08.646 
[2024-07-25 13:31:49.374005] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf02960 00:23:08.646 [2024-07-25 13:31:49.374109] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xefac70 00:23:08.646 [2024-07-25 13:31:49.374114] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xefac70 00:23:08.646 [2024-07-25 13:31:49.374194] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.646 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.906 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:23:08.906 "name": "raid_bdev1", 00:23:08.906 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:08.906 "strip_size_kb": 0, 00:23:08.906 "state": "online", 00:23:08.906 "raid_level": "raid1", 00:23:08.906 "superblock": false, 00:23:08.906 "num_base_bdevs": 2, 00:23:08.906 "num_base_bdevs_discovered": 2, 00:23:08.906 "num_base_bdevs_operational": 2, 00:23:08.906 "base_bdevs_list": [ 00:23:08.906 { 00:23:08.906 "name": "BaseBdev1", 00:23:08.906 "uuid": "c4d0381b-e79a-5e56-91ca-6cddf3d12c25", 00:23:08.906 "is_configured": true, 00:23:08.906 "data_offset": 0, 00:23:08.906 "data_size": 65536 00:23:08.906 }, 00:23:08.906 { 00:23:08.906 "name": "BaseBdev2", 00:23:08.906 "uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:08.906 "is_configured": true, 00:23:08.906 "data_offset": 0, 00:23:08.906 "data_size": 65536 00:23:08.906 } 00:23:08.906 ] 00:23:08.906 }' 00:23:08.906 13:31:49 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.906 13:31:49 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:09.476 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:09.476 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:23:09.736 [2024-07-25 13:31:50.371502] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:09.736 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:23:09.736 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:09.736 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:09.996 13:31:50 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@634 -- # data_offset=0 00:23:09.996 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:23:09.996 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:09.996 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:09.996 [2024-07-25 13:31:50.681494] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xef8dc0 00:23:09.996 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:09.996 Zero copy mechanism will not be used. 00:23:09.996 Running I/O for 60 seconds... 00:23:09.996 [2024-07-25 13:31:50.778084] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:09.996 [2024-07-25 13:31:50.784528] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xef8dc0 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:10.257 
13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.257 13:31:50 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:10.517 13:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:10.517 "name": "raid_bdev1", 00:23:10.517 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:10.517 "strip_size_kb": 0, 00:23:10.517 "state": "online", 00:23:10.517 "raid_level": "raid1", 00:23:10.517 "superblock": false, 00:23:10.517 "num_base_bdevs": 2, 00:23:10.517 "num_base_bdevs_discovered": 1, 00:23:10.517 "num_base_bdevs_operational": 1, 00:23:10.517 "base_bdevs_list": [ 00:23:10.517 { 00:23:10.517 "name": null, 00:23:10.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:10.517 "is_configured": false, 00:23:10.517 "data_offset": 0, 00:23:10.517 "data_size": 65536 00:23:10.517 }, 00:23:10.517 { 00:23:10.517 "name": "BaseBdev2", 00:23:10.517 "uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:10.517 "is_configured": true, 00:23:10.517 "data_offset": 0, 00:23:10.517 "data_size": 65536 00:23:10.517 } 00:23:10.517 ] 00:23:10.517 }' 00:23:10.517 13:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:10.517 13:31:51 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:11.457 13:31:51 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:11.457 [2024-07-25 13:31:52.161772] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
spare is claimed 00:23:11.457 [2024-07-25 13:31:52.212926] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xeff920 00:23:11.457 [2024-07-25 13:31:52.214532] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:11.457 13:31:52 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:11.716 [2024-07-25 13:31:52.334459] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:11.716 [2024-07-25 13:31:52.334676] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:11.976 [2024-07-25 13:31:52.543370] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:11.976 [2024-07-25 13:31:52.543476] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:12.236 [2024-07-25 13:31:52.899388] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:12.497 [2024-07-25 13:31:53.129976] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:12.497 [2024-07-25 13:31:53.130227] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:12.497 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:12.497 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:12.497 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:12.497 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:12.497 13:31:53 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:12.497 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:12.497 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:12.756 [2024-07-25 13:31:53.351788] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:12.756 [2024-07-25 13:31:53.351929] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:12.756 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:12.756 "name": "raid_bdev1", 00:23:12.756 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:12.756 "strip_size_kb": 0, 00:23:12.756 "state": "online", 00:23:12.756 "raid_level": "raid1", 00:23:12.756 "superblock": false, 00:23:12.756 "num_base_bdevs": 2, 00:23:12.756 "num_base_bdevs_discovered": 2, 00:23:12.756 "num_base_bdevs_operational": 2, 00:23:12.756 "process": { 00:23:12.756 "type": "rebuild", 00:23:12.756 "target": "spare", 00:23:12.756 "progress": { 00:23:12.756 "blocks": 16384, 00:23:12.756 "percent": 25 00:23:12.756 } 00:23:12.756 }, 00:23:12.756 "base_bdevs_list": [ 00:23:12.756 { 00:23:12.756 "name": "spare", 00:23:12.756 "uuid": "3e1ac541-045c-5464-b336-2e1845e623ef", 00:23:12.756 "is_configured": true, 00:23:12.756 "data_offset": 0, 00:23:12.756 "data_size": 65536 00:23:12.756 }, 00:23:12.756 { 00:23:12.756 "name": "BaseBdev2", 00:23:12.756 "uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:12.756 "is_configured": true, 00:23:12.756 "data_offset": 0, 00:23:12.756 "data_size": 65536 00:23:12.756 } 00:23:12.756 ] 00:23:12.756 }' 00:23:12.756 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:23:12.756 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:12.756 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:12.756 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:12.756 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:13.015 [2024-07-25 13:31:53.666116] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:13.015 [2024-07-25 13:31:53.695796] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:13.015 [2024-07-25 13:31:53.696051] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:13.015 [2024-07-25 13:31:53.796998] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:13.015 [2024-07-25 13:31:53.804614] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:13.015 [2024-07-25 13:31:53.804631] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:13.015 [2024-07-25 13:31:53.804637] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:13.276 [2024-07-25 13:31:53.827382] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xef8dc0 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.276 13:31:53 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.276 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.276 "name": "raid_bdev1", 00:23:13.276 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:13.276 "strip_size_kb": 0, 00:23:13.276 "state": "online", 00:23:13.276 "raid_level": "raid1", 00:23:13.276 "superblock": false, 00:23:13.276 "num_base_bdevs": 2, 00:23:13.276 "num_base_bdevs_discovered": 1, 00:23:13.276 "num_base_bdevs_operational": 1, 00:23:13.276 "base_bdevs_list": [ 00:23:13.276 { 00:23:13.276 "name": null, 00:23:13.276 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.276 "is_configured": false, 00:23:13.276 "data_offset": 0, 00:23:13.276 "data_size": 65536 00:23:13.276 }, 00:23:13.276 { 00:23:13.276 "name": "BaseBdev2", 00:23:13.276 "uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:13.276 "is_configured": true, 00:23:13.276 "data_offset": 0, 00:23:13.276 
"data_size": 65536 00:23:13.276 } 00:23:13.276 ] 00:23:13.276 }' 00:23:13.276 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.276 13:31:54 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:13.848 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:13.848 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:13.848 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:13.848 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:13.848 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:13.848 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.848 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.108 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.108 "name": "raid_bdev1", 00:23:14.108 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:14.108 "strip_size_kb": 0, 00:23:14.108 "state": "online", 00:23:14.108 "raid_level": "raid1", 00:23:14.108 "superblock": false, 00:23:14.108 "num_base_bdevs": 2, 00:23:14.108 "num_base_bdevs_discovered": 1, 00:23:14.108 "num_base_bdevs_operational": 1, 00:23:14.108 "base_bdevs_list": [ 00:23:14.108 { 00:23:14.108 "name": null, 00:23:14.108 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.108 "is_configured": false, 00:23:14.108 "data_offset": 0, 00:23:14.108 "data_size": 65536 00:23:14.108 }, 00:23:14.108 { 00:23:14.108 "name": "BaseBdev2", 00:23:14.108 "uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:14.108 "is_configured": true, 
00:23:14.108 "data_offset": 0, 00:23:14.108 "data_size": 65536 00:23:14.108 } 00:23:14.108 ] 00:23:14.108 }' 00:23:14.108 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.108 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:14.108 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.108 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:14.108 13:31:54 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:14.369 [2024-07-25 13:31:55.085950] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:14.369 13:31:55 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:23:14.369 [2024-07-25 13:31:55.118326] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xefaf60 00:23:14.369 [2024-07-25 13:31:55.119455] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:14.629 [2024-07-25 13:31:55.245336] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:14.629 [2024-07-25 13:31:55.379912] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:14.629 [2024-07-25 13:31:55.380059] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:15.198 [2024-07-25 13:31:55.819807] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:15.458 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 
rebuild spare 00:23:15.458 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:15.458 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:15.458 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:15.458 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:15.458 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.458 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.458 [2024-07-25 13:31:56.162810] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:15.718 [2024-07-25 13:31:56.289487] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:15.718 "name": "raid_bdev1", 00:23:15.718 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:15.718 "strip_size_kb": 0, 00:23:15.718 "state": "online", 00:23:15.718 "raid_level": "raid1", 00:23:15.718 "superblock": false, 00:23:15.718 "num_base_bdevs": 2, 00:23:15.718 "num_base_bdevs_discovered": 2, 00:23:15.718 "num_base_bdevs_operational": 2, 00:23:15.718 "process": { 00:23:15.718 "type": "rebuild", 00:23:15.718 "target": "spare", 00:23:15.718 "progress": { 00:23:15.718 "blocks": 16384, 00:23:15.718 "percent": 25 00:23:15.718 } 00:23:15.718 }, 00:23:15.718 "base_bdevs_list": [ 00:23:15.718 { 00:23:15.718 "name": "spare", 00:23:15.718 "uuid": "3e1ac541-045c-5464-b336-2e1845e623ef", 00:23:15.718 "is_configured": true, 00:23:15.718 "data_offset": 0, 00:23:15.718 "data_size": 65536 
00:23:15.718 }, 00:23:15.718 { 00:23:15.718 "name": "BaseBdev2", 00:23:15.718 "uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:15.718 "is_configured": true, 00:23:15.718 "data_offset": 0, 00:23:15.718 "data_size": 65536 00:23:15.718 } 00:23:15.718 ] 00:23:15.718 }' 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # local timeout=773 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:15.718 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:15.978 [2024-07-25 13:31:56.611678] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:15.978 [2024-07-25 13:31:56.611946] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:15.978 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:15.978 "name": "raid_bdev1", 00:23:15.978 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:15.978 "strip_size_kb": 0, 00:23:15.978 "state": "online", 00:23:15.978 "raid_level": "raid1", 00:23:15.978 "superblock": false, 00:23:15.978 "num_base_bdevs": 2, 00:23:15.978 "num_base_bdevs_discovered": 2, 00:23:15.978 "num_base_bdevs_operational": 2, 00:23:15.978 "process": { 00:23:15.978 "type": "rebuild", 00:23:15.978 "target": "spare", 00:23:15.978 "progress": { 00:23:15.978 "blocks": 20480, 00:23:15.978 "percent": 31 00:23:15.978 } 00:23:15.978 }, 00:23:15.978 "base_bdevs_list": [ 00:23:15.978 { 00:23:15.978 "name": "spare", 00:23:15.978 "uuid": "3e1ac541-045c-5464-b336-2e1845e623ef", 00:23:15.978 "is_configured": true, 00:23:15.978 "data_offset": 0, 00:23:15.978 "data_size": 65536 00:23:15.978 }, 00:23:15.978 { 00:23:15.978 "name": "BaseBdev2", 00:23:15.978 "uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:15.978 "is_configured": true, 00:23:15.978 "data_offset": 0, 00:23:15.978 "data_size": 65536 00:23:15.978 } 00:23:15.978 ] 00:23:15.978 }' 00:23:15.978 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:16.238 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 
00:23:16.238 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:16.238 [2024-07-25 13:31:56.847115] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:16.238 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:16.238 13:31:56 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:23:16.497 [2024-07-25 13:31:57.163971] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:23:17.438 13:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:23:17.438 13:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:17.438 13:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:17.438 13:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:17.438 13:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:17.438 13:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:17.438 13:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:17.438 13:31:57 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:17.438 [2024-07-25 13:31:58.006566] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:23:17.438 13:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:17.438 "name": "raid_bdev1", 00:23:17.438 "uuid": 
"84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:17.438 "strip_size_kb": 0, 00:23:17.438 "state": "online", 00:23:17.438 "raid_level": "raid1", 00:23:17.438 "superblock": false, 00:23:17.438 "num_base_bdevs": 2, 00:23:17.438 "num_base_bdevs_discovered": 2, 00:23:17.438 "num_base_bdevs_operational": 2, 00:23:17.438 "process": { 00:23:17.438 "type": "rebuild", 00:23:17.438 "target": "spare", 00:23:17.438 "progress": { 00:23:17.438 "blocks": 40960, 00:23:17.438 "percent": 62 00:23:17.438 } 00:23:17.438 }, 00:23:17.438 "base_bdevs_list": [ 00:23:17.438 { 00:23:17.438 "name": "spare", 00:23:17.438 "uuid": "3e1ac541-045c-5464-b336-2e1845e623ef", 00:23:17.438 "is_configured": true, 00:23:17.438 "data_offset": 0, 00:23:17.438 "data_size": 65536 00:23:17.438 }, 00:23:17.438 { 00:23:17.438 "name": "BaseBdev2", 00:23:17.438 "uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:17.438 "is_configured": true, 00:23:17.438 "data_offset": 0, 00:23:17.438 "data_size": 65536 00:23:17.438 } 00:23:17.438 ] 00:23:17.438 }' 00:23:17.438 13:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:17.438 13:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:17.438 13:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:17.697 13:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:17.697 13:31:58 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:23:17.697 [2024-07-25 13:31:58.328912] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:23:17.956 [2024-07-25 13:31:58.652107] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:23:18.526 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 
00:23:18.526 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:18.526 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:18.526 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:18.526 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:18.526 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:18.526 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.526 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.786 [2024-07-25 13:31:59.414339] bdev_raid.c:2886:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:18.786 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:18.786 "name": "raid_bdev1", 00:23:18.786 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:18.786 "strip_size_kb": 0, 00:23:18.786 "state": "online", 00:23:18.786 "raid_level": "raid1", 00:23:18.786 "superblock": false, 00:23:18.786 "num_base_bdevs": 2, 00:23:18.786 "num_base_bdevs_discovered": 2, 00:23:18.786 "num_base_bdevs_operational": 2, 00:23:18.786 "process": { 00:23:18.786 "type": "rebuild", 00:23:18.786 "target": "spare", 00:23:18.786 "progress": { 00:23:18.786 "blocks": 65536, 00:23:18.786 "percent": 100 00:23:18.786 } 00:23:18.786 }, 00:23:18.786 "base_bdevs_list": [ 00:23:18.786 { 00:23:18.786 "name": "spare", 00:23:18.786 "uuid": "3e1ac541-045c-5464-b336-2e1845e623ef", 00:23:18.786 "is_configured": true, 00:23:18.786 "data_offset": 0, 00:23:18.786 "data_size": 65536 00:23:18.786 }, 00:23:18.786 { 00:23:18.786 "name": "BaseBdev2", 00:23:18.786 
"uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:18.786 "is_configured": true, 00:23:18.786 "data_offset": 0, 00:23:18.786 "data_size": 65536 00:23:18.786 } 00:23:18.786 ] 00:23:18.786 }' 00:23:18.786 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:18.786 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:18.786 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:18.786 [2024-07-25 13:31:59.514578] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:18.786 [2024-07-25 13:31:59.515613] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:18.786 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:18.786 13:31:59 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:23:20.166 13:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:23:20.166 13:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:20.166 13:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:20.166 13:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:20.166 13:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:20.166 13:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:20.166 13:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.166 13:32:00 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.426 
13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:20.426 "name": "raid_bdev1", 00:23:20.426 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:20.426 "strip_size_kb": 0, 00:23:20.426 "state": "online", 00:23:20.426 "raid_level": "raid1", 00:23:20.426 "superblock": false, 00:23:20.426 "num_base_bdevs": 2, 00:23:20.426 "num_base_bdevs_discovered": 2, 00:23:20.426 "num_base_bdevs_operational": 2, 00:23:20.426 "base_bdevs_list": [ 00:23:20.426 { 00:23:20.426 "name": "spare", 00:23:20.426 "uuid": "3e1ac541-045c-5464-b336-2e1845e623ef", 00:23:20.426 "is_configured": true, 00:23:20.426 "data_offset": 0, 00:23:20.426 "data_size": 65536 00:23:20.426 }, 00:23:20.426 { 00:23:20.426 "name": "BaseBdev2", 00:23:20.426 "uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:20.426 "is_configured": true, 00:23:20.426 "data_offset": 0, 00:23:20.426 "data_size": 65536 00:23:20.426 } 00:23:20.426 ] 00:23:20.426 }' 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # break 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.426 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.685 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:20.685 "name": "raid_bdev1", 00:23:20.685 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:20.685 "strip_size_kb": 0, 00:23:20.685 "state": "online", 00:23:20.685 "raid_level": "raid1", 00:23:20.685 "superblock": false, 00:23:20.685 "num_base_bdevs": 2, 00:23:20.685 "num_base_bdevs_discovered": 2, 00:23:20.685 "num_base_bdevs_operational": 2, 00:23:20.685 "base_bdevs_list": [ 00:23:20.685 { 00:23:20.685 "name": "spare", 00:23:20.685 "uuid": "3e1ac541-045c-5464-b336-2e1845e623ef", 00:23:20.685 "is_configured": true, 00:23:20.685 "data_offset": 0, 00:23:20.685 "data_size": 65536 00:23:20.685 }, 00:23:20.685 { 00:23:20.685 "name": "BaseBdev2", 00:23:20.685 "uuid": "46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:20.685 "is_configured": true, 00:23:20.685 "data_offset": 0, 00:23:20.685 "data_size": 65536 00:23:20.685 } 00:23:20.685 ] 00:23:20.685 }' 00:23:20.685 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:20.685 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:20.685 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 
-- # local raid_bdev_name=raid_bdev1 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:20.945 "name": "raid_bdev1", 00:23:20.945 "uuid": "84346cfd-9d4a-4f21-af08-37bed943aba7", 00:23:20.945 "strip_size_kb": 0, 00:23:20.945 "state": "online", 00:23:20.945 "raid_level": "raid1", 00:23:20.945 "superblock": false, 00:23:20.945 "num_base_bdevs": 2, 00:23:20.945 "num_base_bdevs_discovered": 2, 00:23:20.945 "num_base_bdevs_operational": 2, 00:23:20.945 "base_bdevs_list": [ 00:23:20.945 { 00:23:20.945 "name": "spare", 00:23:20.945 "uuid": "3e1ac541-045c-5464-b336-2e1845e623ef", 00:23:20.945 "is_configured": true, 00:23:20.945 "data_offset": 0, 00:23:20.945 "data_size": 65536 00:23:20.945 }, 00:23:20.945 { 00:23:20.945 "name": "BaseBdev2", 00:23:20.945 "uuid": 
"46bf2b5a-4e2a-5da1-8d7d-274f8099079d", 00:23:20.945 "is_configured": true, 00:23:20.945 "data_offset": 0, 00:23:20.945 "data_size": 65536 00:23:20.945 } 00:23:20.945 ] 00:23:20.945 }' 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:20.945 13:32:01 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:21.514 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:21.832 [2024-07-25 13:32:02.407317] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:21.832 [2024-07-25 13:32:02.407339] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:21.832 00:23:21.832 Latency(us) 00:23:21.832 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:21.832 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:21.832 raid_bdev1 : 11.71 118.50 355.51 0.00 0.00 11873.14 245.76 114536.76 00:23:21.832 =================================================================================================================== 00:23:21.832 Total : 118.50 355.51 0.00 0.00 11873.14 245.76 114536.76 00:23:21.832 [2024-07-25 13:32:02.426543] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:21.832 [2024-07-25 13:32:02.426571] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:21.832 [2024-07-25 13:32:02.426624] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:21.832 [2024-07-25 13:32:02.426630] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xefac70 name raid_bdev1, state offline 00:23:21.832 0 00:23:21.833 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:21.833 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # jq length 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:22.143 /dev/nbd0 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 
00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:22.143 1+0 records in 00:23:22.143 1+0 records out 00:23:22.143 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281681 s, 14.5 MB/s 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in 
"${base_bdevs[@]:1}" 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev2 ']' 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:22.143 13:32:02 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:23:22.403 /dev/nbd1 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:22.403 1+0 records in 00:23:22.403 1+0 records out 00:23:22.403 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000287159 s, 14.3 MB/s 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # 
nbd_list=('/dev/nbd1') 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:22.403 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:22.662 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:22.662 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:22.662 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:22.662 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:22.662 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:22.662 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:22.662 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:22.662 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:22.662 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:22.663 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:22.663 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:22.663 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:22.663 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:23:22.663 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:22.663 13:32:03 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@798 -- # killprocess 1005569 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # '[' -z 1005569 ']' 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # kill -0 1005569 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # uname 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1005569 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:22.922 13:32:03 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1005569' 00:23:22.922 killing process with pid 1005569 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@969 -- # kill 1005569 00:23:22.922 Received shutdown signal, test time was about 12.920876 seconds 00:23:22.922 00:23:22.922 Latency(us) 00:23:22.922 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:22.922 =================================================================================================================== 00:23:22.922 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:22.922 [2024-07-25 13:32:03.634418] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:22.922 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@974 -- # wait 1005569 00:23:22.922 [2024-07-25 13:32:03.645917] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@800 -- # return 0 00:23:23.182 00:23:23.182 real 0m16.805s 00:23:23.182 user 0m26.386s 00:23:23.182 sys 0m1.919s 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:23:23.182 ************************************ 00:23:23.182 END TEST raid_rebuild_test_io 00:23:23.182 ************************************ 00:23:23.182 13:32:03 bdev_raid -- bdev/bdev_raid.sh@960 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:23:23.182 13:32:03 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:23:23.182 13:32:03 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:23.182 13:32:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:23.182 ************************************ 00:23:23.182 START TEST raid_rebuild_test_sb_io 00:23:23.182 
************************************ 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true true true 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # local strip_size 
00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1008588 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1008588 /var/tmp/spdk-raid.sock 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # '[' -z 1008588 ']' 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:23.182 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:23.182 13:32:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:23.182 [2024-07-25 13:32:03.913815] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:23:23.182 [2024-07-25 13:32:03.913869] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1008588 ] 00:23:23.182 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:23.182 Zero copy mechanism will not be used. 00:23:23.442 [2024-07-25 13:32:04.004494] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:23.443 [2024-07-25 13:32:04.081109] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:23.443 [2024-07-25 13:32:04.128648] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:23.443 [2024-07-25 13:32:04.128675] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:24.012 13:32:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:24.012 13:32:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # return 0 00:23:24.012 13:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:24.012 13:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:24.272 BaseBdev1_malloc 00:23:24.272 13:32:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p 
BaseBdev1 00:23:24.532 [2024-07-25 13:32:05.108889] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:24.532 [2024-07-25 13:32:05.108922] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:24.532 [2024-07-25 13:32:05.108936] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21acd10 00:23:24.532 [2024-07-25 13:32:05.108942] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:24.532 [2024-07-25 13:32:05.110201] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:24.532 [2024-07-25 13:32:05.110221] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:24.532 BaseBdev1 00:23:24.532 13:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:24.532 13:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:24.532 BaseBdev2_malloc 00:23:24.532 13:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:24.792 [2024-07-25 13:32:05.479807] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:24.792 [2024-07-25 13:32:05.479836] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:24.792 [2024-07-25 13:32:05.479848] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21ad6d0 00:23:24.792 [2024-07-25 13:32:05.479854] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:24.792 [2024-07-25 13:32:05.481018] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:24.792 [2024-07-25 
13:32:05.481037] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:24.792 BaseBdev2 00:23:24.793 13:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:25.053 spare_malloc 00:23:25.053 13:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:25.314 spare_delay 00:23:25.314 13:32:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:25.314 [2024-07-25 13:32:06.034946] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:25.314 [2024-07-25 13:32:06.034974] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:25.314 [2024-07-25 13:32:06.034986] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21a4ac0 00:23:25.314 [2024-07-25 13:32:06.034992] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:25.314 [2024-07-25 13:32:06.036191] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:25.314 [2024-07-25 13:32:06.036214] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:25.314 spare 00:23:25.314 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:23:25.574 [2024-07-25 13:32:06.223440] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:25.574 
[2024-07-25 13:32:06.224440] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:25.574 [2024-07-25 13:32:06.224543] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x21a5c70 00:23:25.574 [2024-07-25 13:32:06.224557] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:25.574 [2024-07-25 13:32:06.224711] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21ad960 00:23:25.574 [2024-07-25 13:32:06.224818] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21a5c70 00:23:25.574 [2024-07-25 13:32:06.224824] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21a5c70 00:23:25.574 [2024-07-25 13:32:06.224903] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:25.574 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:25.833 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:25.833 "name": "raid_bdev1", 00:23:25.833 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:25.833 "strip_size_kb": 0, 00:23:25.833 "state": "online", 00:23:25.833 "raid_level": "raid1", 00:23:25.833 "superblock": true, 00:23:25.833 "num_base_bdevs": 2, 00:23:25.833 "num_base_bdevs_discovered": 2, 00:23:25.833 "num_base_bdevs_operational": 2, 00:23:25.833 "base_bdevs_list": [ 00:23:25.833 { 00:23:25.833 "name": "BaseBdev1", 00:23:25.833 "uuid": "f9901652-2c86-5f79-8810-2925447ce7a2", 00:23:25.833 "is_configured": true, 00:23:25.833 "data_offset": 2048, 00:23:25.833 "data_size": 63488 00:23:25.833 }, 00:23:25.833 { 00:23:25.833 "name": "BaseBdev2", 00:23:25.833 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:25.833 "is_configured": true, 00:23:25.833 "data_offset": 2048, 00:23:25.833 "data_size": 63488 00:23:25.833 } 00:23:25.833 ] 00:23:25.833 }' 00:23:25.833 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:25.833 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:26.403 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:26.403 13:32:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:23:26.403 [2024-07-25 13:32:07.166014] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:26.403 13:32:07 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:23:26.403 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.403 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:26.664 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:23:26.664 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:23:26.664 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:26.664 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:23:26.924 [2024-07-25 13:32:07.468057] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a3dc0 00:23:26.924 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:26.924 Zero copy mechanism will not be used. 00:23:26.924 Running I/O for 60 seconds... 
00:23:26.924 [2024-07-25 13:32:07.549772] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:26.924 [2024-07-25 13:32:07.556192] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x21a3dc0 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:26.924 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:27.184 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:27.184 "name": "raid_bdev1", 00:23:27.184 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:27.184 "strip_size_kb": 0, 00:23:27.184 "state": "online", 00:23:27.184 "raid_level": 
"raid1", 00:23:27.184 "superblock": true, 00:23:27.184 "num_base_bdevs": 2, 00:23:27.184 "num_base_bdevs_discovered": 1, 00:23:27.184 "num_base_bdevs_operational": 1, 00:23:27.184 "base_bdevs_list": [ 00:23:27.184 { 00:23:27.184 "name": null, 00:23:27.184 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:27.184 "is_configured": false, 00:23:27.184 "data_offset": 2048, 00:23:27.184 "data_size": 63488 00:23:27.184 }, 00:23:27.184 { 00:23:27.184 "name": "BaseBdev2", 00:23:27.184 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:27.184 "is_configured": true, 00:23:27.184 "data_offset": 2048, 00:23:27.184 "data_size": 63488 00:23:27.184 } 00:23:27.184 ] 00:23:27.184 }' 00:23:27.184 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:27.184 13:32:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:27.753 13:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:27.753 [2024-07-25 13:32:08.498760] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:27.753 13:32:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:27.753 [2024-07-25 13:32:08.543803] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21afd30 00:23:28.013 [2024-07-25 13:32:08.545585] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:28.013 [2024-07-25 13:32:08.666464] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:28.013 [2024-07-25 13:32:08.666670] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:23:28.013 [2024-07-25 13:32:08.788387] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:28.013 [2024-07-25 13:32:08.788489] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:28.583 [2024-07-25 13:32:09.112407] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:28.583 [2024-07-25 13:32:09.232999] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:28.583 [2024-07-25 13:32:09.233119] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:23:28.843 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:28.843 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:28.843 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:28.843 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:28.843 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:28.843 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.843 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.843 [2024-07-25 13:32:09.583128] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:28.843 [2024-07-25 13:32:09.583338] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:29.104 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:29.104 "name": "raid_bdev1", 00:23:29.104 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:29.104 "strip_size_kb": 0, 00:23:29.104 "state": "online", 00:23:29.104 "raid_level": "raid1", 00:23:29.104 "superblock": true, 00:23:29.104 "num_base_bdevs": 2, 00:23:29.104 "num_base_bdevs_discovered": 2, 00:23:29.104 "num_base_bdevs_operational": 2, 00:23:29.104 "process": { 00:23:29.104 "type": "rebuild", 00:23:29.104 "target": "spare", 00:23:29.104 "progress": { 00:23:29.104 "blocks": 14336, 00:23:29.104 "percent": 22 00:23:29.104 } 00:23:29.104 }, 00:23:29.104 "base_bdevs_list": [ 00:23:29.104 { 00:23:29.104 "name": "spare", 00:23:29.104 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:29.104 "is_configured": true, 00:23:29.104 "data_offset": 2048, 00:23:29.104 "data_size": 63488 00:23:29.104 }, 00:23:29.104 { 00:23:29.104 "name": "BaseBdev2", 00:23:29.104 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:29.104 "is_configured": true, 00:23:29.104 "data_offset": 2048, 00:23:29.104 "data_size": 63488 00:23:29.104 } 00:23:29.104 ] 00:23:29.104 }' 00:23:29.104 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:29.104 [2024-07-25 13:32:09.784686] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:29.104 [2024-07-25 13:32:09.784808] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:29.104 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:29.104 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:29.104 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:29.104 13:32:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:29.364 [2024-07-25 13:32:09.990706] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:29.364 [2024-07-25 13:32:10.106444] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:29.364 [2024-07-25 13:32:10.120581] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:29.364 [2024-07-25 13:32:10.120600] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:29.364 [2024-07-25 13:32:10.120606] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:29.364 [2024-07-25 13:32:10.149911] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x21a3dc0 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:29.625 "name": "raid_bdev1", 00:23:29.625 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:29.625 "strip_size_kb": 0, 00:23:29.625 "state": "online", 00:23:29.625 "raid_level": "raid1", 00:23:29.625 "superblock": true, 00:23:29.625 "num_base_bdevs": 2, 00:23:29.625 "num_base_bdevs_discovered": 1, 00:23:29.625 "num_base_bdevs_operational": 1, 00:23:29.625 "base_bdevs_list": [ 00:23:29.625 { 00:23:29.625 "name": null, 00:23:29.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:29.625 "is_configured": false, 00:23:29.625 "data_offset": 2048, 00:23:29.625 "data_size": 63488 00:23:29.625 }, 00:23:29.625 { 00:23:29.625 "name": "BaseBdev2", 00:23:29.625 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:29.625 "is_configured": true, 00:23:29.625 "data_offset": 2048, 00:23:29.625 "data_size": 63488 00:23:29.625 } 00:23:29.625 ] 00:23:29.625 }' 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:29.625 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:30.196 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:30.196 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:30.196 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:30.196 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local 
target=none 00:23:30.196 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:30.196 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.196 13:32:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.456 13:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:30.456 "name": "raid_bdev1", 00:23:30.456 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:30.456 "strip_size_kb": 0, 00:23:30.456 "state": "online", 00:23:30.456 "raid_level": "raid1", 00:23:30.456 "superblock": true, 00:23:30.456 "num_base_bdevs": 2, 00:23:30.457 "num_base_bdevs_discovered": 1, 00:23:30.457 "num_base_bdevs_operational": 1, 00:23:30.457 "base_bdevs_list": [ 00:23:30.457 { 00:23:30.457 "name": null, 00:23:30.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.457 "is_configured": false, 00:23:30.457 "data_offset": 2048, 00:23:30.457 "data_size": 63488 00:23:30.457 }, 00:23:30.457 { 00:23:30.457 "name": "BaseBdev2", 00:23:30.457 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:30.457 "is_configured": true, 00:23:30.457 "data_offset": 2048, 00:23:30.457 "data_size": 63488 00:23:30.457 } 00:23:30.457 ] 00:23:30.457 }' 00:23:30.457 13:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:30.457 13:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:30.457 13:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:30.717 13:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:30.717 13:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@677 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:30.717 [2024-07-25 13:32:11.440102] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:30.717 13:32:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:23:30.717 [2024-07-25 13:32:11.497669] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21a4a70 00:23:30.717 [2024-07-25 13:32:11.498808] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:30.976 [2024-07-25 13:32:11.752508] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:30.976 [2024-07-25 13:32:11.752648] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:23:31.237 [2024-07-25 13:32:11.981757] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:31.237 [2024-07-25 13:32:11.981959] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:23:31.807 [2024-07-25 13:32:12.412518] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:31.807 [2024-07-25 13:32:12.412797] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:23:31.807 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:31.807 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:31.807 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:31.807 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:23:31.807 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:31.807 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.807 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.067 [2024-07-25 13:32:12.641674] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:32.067 "name": "raid_bdev1", 00:23:32.067 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:32.067 "strip_size_kb": 0, 00:23:32.067 "state": "online", 00:23:32.067 "raid_level": "raid1", 00:23:32.067 "superblock": true, 00:23:32.067 "num_base_bdevs": 2, 00:23:32.067 "num_base_bdevs_discovered": 2, 00:23:32.067 "num_base_bdevs_operational": 2, 00:23:32.067 "process": { 00:23:32.067 "type": "rebuild", 00:23:32.067 "target": "spare", 00:23:32.067 "progress": { 00:23:32.067 "blocks": 16384, 00:23:32.067 "percent": 25 00:23:32.067 } 00:23:32.067 }, 00:23:32.067 "base_bdevs_list": [ 00:23:32.067 { 00:23:32.067 "name": "spare", 00:23:32.067 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:32.067 "is_configured": true, 00:23:32.067 "data_offset": 2048, 00:23:32.067 "data_size": 63488 00:23:32.067 }, 00:23:32.067 { 00:23:32.067 "name": "BaseBdev2", 00:23:32.067 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:32.067 "is_configured": true, 00:23:32.067 "data_offset": 2048, 00:23:32.067 "data_size": 63488 00:23:32.067 } 00:23:32.067 ] 00:23:32.067 }' 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:32.067 13:32:12 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:23:32.067 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # local timeout=789 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:32.067 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:32.068 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:32.068 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:32.068 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:32.068 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:32.068 
13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:32.328 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:32.328 "name": "raid_bdev1", 00:23:32.328 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:32.328 "strip_size_kb": 0, 00:23:32.328 "state": "online", 00:23:32.328 "raid_level": "raid1", 00:23:32.328 "superblock": true, 00:23:32.328 "num_base_bdevs": 2, 00:23:32.328 "num_base_bdevs_discovered": 2, 00:23:32.328 "num_base_bdevs_operational": 2, 00:23:32.328 "process": { 00:23:32.328 "type": "rebuild", 00:23:32.328 "target": "spare", 00:23:32.328 "progress": { 00:23:32.328 "blocks": 18432, 00:23:32.328 "percent": 29 00:23:32.328 } 00:23:32.328 }, 00:23:32.328 "base_bdevs_list": [ 00:23:32.328 { 00:23:32.328 "name": "spare", 00:23:32.328 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:32.328 "is_configured": true, 00:23:32.328 "data_offset": 2048, 00:23:32.328 "data_size": 63488 00:23:32.328 }, 00:23:32.328 { 00:23:32.328 "name": "BaseBdev2", 00:23:32.328 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:32.328 "is_configured": true, 00:23:32.328 "data_offset": 2048, 00:23:32.328 "data_size": 63488 00:23:32.328 } 00:23:32.328 ] 00:23:32.328 }' 00:23:32.328 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:32.328 [2024-07-25 13:32:12.971959] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:32.328 [2024-07-25 13:32:12.972238] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:23:32.328 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:32.328 13:32:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:32.328 
13:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:32.328 13:32:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:23:32.588 [2024-07-25 13:32:13.181165] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:23:32.849 [2024-07-25 13:32:13.499389] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:23:32.849 [2024-07-25 13:32:13.499667] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:23:33.108 [2024-07-25 13:32:13.722242] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:23:33.369 [2024-07-25 13:32:13.943431] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:23:33.369 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:23:33.369 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:33.369 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:33.369 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:33.369 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:33.369 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:33.369 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.369 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] 
| select(.name == "raid_bdev1")' 00:23:33.369 [2024-07-25 13:32:14.151113] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:23:33.629 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:33.629 "name": "raid_bdev1", 00:23:33.629 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:33.629 "strip_size_kb": 0, 00:23:33.629 "state": "online", 00:23:33.629 "raid_level": "raid1", 00:23:33.629 "superblock": true, 00:23:33.629 "num_base_bdevs": 2, 00:23:33.629 "num_base_bdevs_discovered": 2, 00:23:33.629 "num_base_bdevs_operational": 2, 00:23:33.629 "process": { 00:23:33.629 "type": "rebuild", 00:23:33.629 "target": "spare", 00:23:33.629 "progress": { 00:23:33.629 "blocks": 34816, 00:23:33.629 "percent": 54 00:23:33.629 } 00:23:33.629 }, 00:23:33.629 "base_bdevs_list": [ 00:23:33.629 { 00:23:33.629 "name": "spare", 00:23:33.629 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:33.629 "is_configured": true, 00:23:33.629 "data_offset": 2048, 00:23:33.629 "data_size": 63488 00:23:33.629 }, 00:23:33.629 { 00:23:33.629 "name": "BaseBdev2", 00:23:33.629 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:33.629 "is_configured": true, 00:23:33.629 "data_offset": 2048, 00:23:33.629 "data_size": 63488 00:23:33.629 } 00:23:33.629 ] 00:23:33.629 }' 00:23:33.629 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:33.629 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:33.629 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:33.629 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:33.629 13:32:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:23:34.569 [2024-07-25 13:32:15.133032] bdev_raid.c: 
852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:23:34.569 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:23:34.569 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:34.569 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:34.569 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:34.569 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:34.569 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:34.569 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.569 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.828 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:34.828 "name": "raid_bdev1", 00:23:34.828 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:34.828 "strip_size_kb": 0, 00:23:34.828 "state": "online", 00:23:34.828 "raid_level": "raid1", 00:23:34.828 "superblock": true, 00:23:34.828 "num_base_bdevs": 2, 00:23:34.828 "num_base_bdevs_discovered": 2, 00:23:34.829 "num_base_bdevs_operational": 2, 00:23:34.829 "process": { 00:23:34.829 "type": "rebuild", 00:23:34.829 "target": "spare", 00:23:34.829 "progress": { 00:23:34.829 "blocks": 55296, 00:23:34.829 "percent": 87 00:23:34.829 } 00:23:34.829 }, 00:23:34.829 "base_bdevs_list": [ 00:23:34.829 { 00:23:34.829 "name": "spare", 00:23:34.829 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:34.829 "is_configured": true, 00:23:34.829 "data_offset": 
2048, 00:23:34.829 "data_size": 63488 00:23:34.829 }, 00:23:34.829 { 00:23:34.829 "name": "BaseBdev2", 00:23:34.829 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:34.829 "is_configured": true, 00:23:34.829 "data_offset": 2048, 00:23:34.829 "data_size": 63488 00:23:34.829 } 00:23:34.829 ] 00:23:34.829 }' 00:23:34.829 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:34.829 [2024-07-25 13:32:15.579676] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:23:34.829 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:34.829 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.089 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:35.089 13:32:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:23:35.348 [2024-07-25 13:32:16.009741] bdev_raid.c:2886:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:35.348 [2024-07-25 13:32:16.110060] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:35.348 [2024-07-25 13:32:16.110978] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:35.917 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:23:35.917 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:35.917 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.917 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:35.917 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local 
target=spare 00:23:35.917 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.917 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.917 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:36.179 "name": "raid_bdev1", 00:23:36.179 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:36.179 "strip_size_kb": 0, 00:23:36.179 "state": "online", 00:23:36.179 "raid_level": "raid1", 00:23:36.179 "superblock": true, 00:23:36.179 "num_base_bdevs": 2, 00:23:36.179 "num_base_bdevs_discovered": 2, 00:23:36.179 "num_base_bdevs_operational": 2, 00:23:36.179 "base_bdevs_list": [ 00:23:36.179 { 00:23:36.179 "name": "spare", 00:23:36.179 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:36.179 "is_configured": true, 00:23:36.179 "data_offset": 2048, 00:23:36.179 "data_size": 63488 00:23:36.179 }, 00:23:36.179 { 00:23:36.179 "name": "BaseBdev2", 00:23:36.179 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:36.179 "is_configured": true, 00:23:36.179 "data_offset": 2048, 00:23:36.179 "data_size": 63488 00:23:36.179 } 00:23:36.179 ] 00:23:36.179 }' 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # break 00:23:36.179 
13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.179 13:32:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:36.440 "name": "raid_bdev1", 00:23:36.440 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:36.440 "strip_size_kb": 0, 00:23:36.440 "state": "online", 00:23:36.440 "raid_level": "raid1", 00:23:36.440 "superblock": true, 00:23:36.440 "num_base_bdevs": 2, 00:23:36.440 "num_base_bdevs_discovered": 2, 00:23:36.440 "num_base_bdevs_operational": 2, 00:23:36.440 "base_bdevs_list": [ 00:23:36.440 { 00:23:36.440 "name": "spare", 00:23:36.440 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:36.440 "is_configured": true, 00:23:36.440 "data_offset": 2048, 00:23:36.440 "data_size": 63488 00:23:36.440 }, 00:23:36.440 { 00:23:36.440 "name": "BaseBdev2", 00:23:36.440 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:36.440 "is_configured": true, 00:23:36.440 "data_offset": 2048, 00:23:36.440 "data_size": 63488 00:23:36.440 } 00:23:36.440 ] 00:23:36.440 }' 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:36.440 
13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.440 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.700 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.700 "name": "raid_bdev1", 00:23:36.700 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:36.700 
"strip_size_kb": 0, 00:23:36.700 "state": "online", 00:23:36.700 "raid_level": "raid1", 00:23:36.700 "superblock": true, 00:23:36.700 "num_base_bdevs": 2, 00:23:36.700 "num_base_bdevs_discovered": 2, 00:23:36.700 "num_base_bdevs_operational": 2, 00:23:36.700 "base_bdevs_list": [ 00:23:36.700 { 00:23:36.700 "name": "spare", 00:23:36.700 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:36.700 "is_configured": true, 00:23:36.700 "data_offset": 2048, 00:23:36.700 "data_size": 63488 00:23:36.700 }, 00:23:36.700 { 00:23:36.700 "name": "BaseBdev2", 00:23:36.700 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:36.700 "is_configured": true, 00:23:36.700 "data_offset": 2048, 00:23:36.700 "data_size": 63488 00:23:36.700 } 00:23:36.700 ] 00:23:36.700 }' 00:23:36.700 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.700 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:37.270 13:32:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:37.530 [2024-07-25 13:32:18.080415] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:37.530 [2024-07-25 13:32:18.080434] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:37.530 00:23:37.530 Latency(us) 00:23:37.530 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:37.530 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:23:37.530 raid_bdev1 : 10.67 112.60 337.81 0.00 0.00 11684.42 245.76 114536.76 00:23:37.530 =================================================================================================================== 00:23:37.530 Total : 112.60 337.81 0.00 0.00 11684.42 245.76 114536.76 00:23:37.530 [2024-07-25 13:32:18.163810] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:37.530 [2024-07-25 13:32:18.163834] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:37.530 [2024-07-25 13:32:18.163888] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:37.530 [2024-07-25 13:32:18.163895] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21a5c70 name raid_bdev1, state offline 00:23:37.530 0 00:23:37.530 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.530 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # jq length 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:37.790 13:32:18 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:23:37.790 /dev/nbd0 00:23:37.790 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:38.050 1+0 records in 00:23:38.050 1+0 records out 00:23:38.050 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283629 s, 14.4 MB/s 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:38.050 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 
00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev2 ']' 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 
00:23:38.051 /dev/nbd1 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:38.051 1+0 records in 00:23:38.051 1+0 records out 00:23:38.051 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297479 s, 13.8 MB/s 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 
00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:38.051 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:38.311 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:38.311 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:38.311 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:38.311 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:38.311 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:38.311 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:38.312 13:32:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:38.312 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:38.312 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:38.312 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:38.312 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:38.312 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:38.312 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@41 -- # break 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:23:38.572 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:38.833 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:39.093 [2024-07-25 13:32:19.660171] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:39.093 [2024-07-25 13:32:19.660200] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:39.093 [2024-07-25 13:32:19.660215] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21c1200 00:23:39.093 [2024-07-25 13:32:19.660222] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:39.093 [2024-07-25 13:32:19.661599] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:39.093 [2024-07-25 13:32:19.661625] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:39.093 [2024-07-25 13:32:19.661686] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:39.093 [2024-07-25 13:32:19.661707] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:39.093 [2024-07-25 13:32:19.661789] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:39.093 spare 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:39.093 13:32:19 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.093 [2024-07-25 13:32:19.762081] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x21ab970 00:23:39.093 [2024-07-25 13:32:19.762090] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:39.093 [2024-07-25 13:32:19.762245] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21ac9e0 00:23:39.093 [2024-07-25 13:32:19.762361] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21ab970 00:23:39.093 [2024-07-25 13:32:19.762367] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x21ab970 00:23:39.093 [2024-07-25 13:32:19.762450] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:39.093 "name": "raid_bdev1", 00:23:39.093 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:39.093 "strip_size_kb": 0, 00:23:39.093 "state": "online", 00:23:39.093 
"raid_level": "raid1", 00:23:39.093 "superblock": true, 00:23:39.093 "num_base_bdevs": 2, 00:23:39.093 "num_base_bdevs_discovered": 2, 00:23:39.093 "num_base_bdevs_operational": 2, 00:23:39.093 "base_bdevs_list": [ 00:23:39.093 { 00:23:39.093 "name": "spare", 00:23:39.093 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:39.093 "is_configured": true, 00:23:39.093 "data_offset": 2048, 00:23:39.093 "data_size": 63488 00:23:39.093 }, 00:23:39.093 { 00:23:39.093 "name": "BaseBdev2", 00:23:39.093 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:39.093 "is_configured": true, 00:23:39.093 "data_offset": 2048, 00:23:39.093 "data_size": 63488 00:23:39.093 } 00:23:39.093 ] 00:23:39.093 }' 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:39.093 13:32:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:39.664 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:39.664 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:39.664 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:39.664 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:39.664 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:39.664 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.664 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:39.925 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:39.925 "name": "raid_bdev1", 00:23:39.925 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 
00:23:39.925 "strip_size_kb": 0, 00:23:39.925 "state": "online", 00:23:39.925 "raid_level": "raid1", 00:23:39.925 "superblock": true, 00:23:39.925 "num_base_bdevs": 2, 00:23:39.925 "num_base_bdevs_discovered": 2, 00:23:39.925 "num_base_bdevs_operational": 2, 00:23:39.925 "base_bdevs_list": [ 00:23:39.925 { 00:23:39.925 "name": "spare", 00:23:39.925 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:39.925 "is_configured": true, 00:23:39.925 "data_offset": 2048, 00:23:39.925 "data_size": 63488 00:23:39.925 }, 00:23:39.925 { 00:23:39.925 "name": "BaseBdev2", 00:23:39.925 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:39.925 "is_configured": true, 00:23:39.925 "data_offset": 2048, 00:23:39.925 "data_size": 63488 00:23:39.925 } 00:23:39.925 ] 00:23:39.925 }' 00:23:39.925 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:39.925 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:39.925 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:39.925 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:39.925 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:39.925 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:40.185 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:23:40.185 13:32:20 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:40.445 [2024-07-25 13:32:21.015884] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:40.446 13:32:21 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:40.446 "name": "raid_bdev1", 00:23:40.446 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:40.446 "strip_size_kb": 0, 00:23:40.446 "state": "online", 00:23:40.446 "raid_level": "raid1", 00:23:40.446 "superblock": true, 00:23:40.446 "num_base_bdevs": 2, 00:23:40.446 "num_base_bdevs_discovered": 1, 00:23:40.446 "num_base_bdevs_operational": 1, 00:23:40.446 "base_bdevs_list": [ 00:23:40.446 { 00:23:40.446 "name": null, 00:23:40.446 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:23:40.446 "is_configured": false, 00:23:40.446 "data_offset": 2048, 00:23:40.446 "data_size": 63488 00:23:40.446 }, 00:23:40.446 { 00:23:40.446 "name": "BaseBdev2", 00:23:40.446 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:40.446 "is_configured": true, 00:23:40.446 "data_offset": 2048, 00:23:40.446 "data_size": 63488 00:23:40.446 } 00:23:40.446 ] 00:23:40.446 }' 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:40.446 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:41.019 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:41.313 [2024-07-25 13:32:21.890216] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:41.313 [2024-07-25 13:32:21.890328] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:41.313 [2024-07-25 13:32:21.890337] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:41.313 [2024-07-25 13:32:21.890355] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:41.313 [2024-07-25 13:32:21.893957] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21ad960 00:23:41.313 [2024-07-25 13:32:21.895432] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:41.313 13:32:21 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # sleep 1 00:23:42.259 13:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:42.259 13:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:42.259 13:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:42.259 13:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:42.259 13:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:42.259 13:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.259 13:32:22 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:42.519 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:42.519 "name": "raid_bdev1", 00:23:42.519 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:42.519 "strip_size_kb": 0, 00:23:42.519 "state": "online", 00:23:42.519 "raid_level": "raid1", 00:23:42.519 "superblock": true, 00:23:42.519 "num_base_bdevs": 2, 00:23:42.519 "num_base_bdevs_discovered": 2, 00:23:42.519 "num_base_bdevs_operational": 2, 00:23:42.519 "process": { 00:23:42.519 "type": "rebuild", 00:23:42.519 "target": "spare", 00:23:42.519 "progress": { 00:23:42.519 "blocks": 22528, 
00:23:42.519 "percent": 35 00:23:42.519 } 00:23:42.519 }, 00:23:42.519 "base_bdevs_list": [ 00:23:42.519 { 00:23:42.519 "name": "spare", 00:23:42.519 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:42.520 "is_configured": true, 00:23:42.520 "data_offset": 2048, 00:23:42.520 "data_size": 63488 00:23:42.520 }, 00:23:42.520 { 00:23:42.520 "name": "BaseBdev2", 00:23:42.520 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:42.520 "is_configured": true, 00:23:42.520 "data_offset": 2048, 00:23:42.520 "data_size": 63488 00:23:42.520 } 00:23:42.520 ] 00:23:42.520 }' 00:23:42.520 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:42.520 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:42.520 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:42.520 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:42.520 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:42.781 [2024-07-25 13:32:23.356258] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:42.781 [2024-07-25 13:32:23.404339] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:42.781 [2024-07-25 13:32:23.404373] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:42.781 [2024-07-25 13:32:23.404383] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:42.781 [2024-07-25 13:32:23.404387] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:42.781 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.040 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:43.040 "name": "raid_bdev1", 00:23:43.040 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:43.040 "strip_size_kb": 0, 00:23:43.040 "state": "online", 00:23:43.040 "raid_level": "raid1", 00:23:43.040 "superblock": true, 00:23:43.040 "num_base_bdevs": 2, 00:23:43.040 "num_base_bdevs_discovered": 1, 00:23:43.040 "num_base_bdevs_operational": 1, 00:23:43.040 "base_bdevs_list": [ 00:23:43.040 { 00:23:43.040 "name": null, 00:23:43.040 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:43.040 "is_configured": false, 00:23:43.040 
"data_offset": 2048, 00:23:43.040 "data_size": 63488 00:23:43.040 }, 00:23:43.040 { 00:23:43.040 "name": "BaseBdev2", 00:23:43.040 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:43.040 "is_configured": true, 00:23:43.040 "data_offset": 2048, 00:23:43.040 "data_size": 63488 00:23:43.040 } 00:23:43.040 ] 00:23:43.040 }' 00:23:43.040 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:43.040 13:32:23 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:43.609 13:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:43.609 [2024-07-25 13:32:24.334873] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:43.609 [2024-07-25 13:32:24.334910] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:43.609 [2024-07-25 13:32:24.334924] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21a3aa0 00:23:43.609 [2024-07-25 13:32:24.334931] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:43.609 [2024-07-25 13:32:24.335241] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:43.609 [2024-07-25 13:32:24.335253] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:43.609 [2024-07-25 13:32:24.335314] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:43.609 [2024-07-25 13:32:24.335322] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:43.609 [2024-07-25 13:32:24.335327] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:43.609 [2024-07-25 13:32:24.335339] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:43.609 [2024-07-25 13:32:24.339025] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21ad960 00:23:43.609 spare 00:23:43.609 [2024-07-25 13:32:24.340163] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:43.609 13:32:24 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # sleep 1 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:44.992 "name": "raid_bdev1", 00:23:44.992 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:44.992 "strip_size_kb": 0, 00:23:44.992 "state": "online", 00:23:44.992 "raid_level": "raid1", 00:23:44.992 "superblock": true, 00:23:44.992 "num_base_bdevs": 2, 00:23:44.992 "num_base_bdevs_discovered": 2, 00:23:44.992 "num_base_bdevs_operational": 2, 00:23:44.992 "process": { 00:23:44.992 "type": "rebuild", 00:23:44.992 "target": "spare", 00:23:44.992 "progress": { 00:23:44.992 
"blocks": 22528, 00:23:44.992 "percent": 35 00:23:44.992 } 00:23:44.992 }, 00:23:44.992 "base_bdevs_list": [ 00:23:44.992 { 00:23:44.992 "name": "spare", 00:23:44.992 "uuid": "497280e9-83a5-5233-9411-2ac5087d170d", 00:23:44.992 "is_configured": true, 00:23:44.992 "data_offset": 2048, 00:23:44.992 "data_size": 63488 00:23:44.992 }, 00:23:44.992 { 00:23:44.992 "name": "BaseBdev2", 00:23:44.992 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:44.992 "is_configured": true, 00:23:44.992 "data_offset": 2048, 00:23:44.992 "data_size": 63488 00:23:44.992 } 00:23:44.992 ] 00:23:44.992 }' 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:44.992 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:44.993 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:44.993 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:45.253 [2024-07-25 13:32:25.832412] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:45.253 [2024-07-25 13:32:25.848996] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:45.253 [2024-07-25 13:32:25.849025] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:45.253 [2024-07-25 13:32:25.849035] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:45.253 [2024-07-25 13:32:25.849039] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:45.253 13:32:25 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:45.513 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:45.513 "name": "raid_bdev1", 00:23:45.513 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:45.513 "strip_size_kb": 0, 00:23:45.513 "state": "online", 00:23:45.513 "raid_level": "raid1", 00:23:45.513 "superblock": true, 00:23:45.513 "num_base_bdevs": 2, 00:23:45.513 "num_base_bdevs_discovered": 1, 00:23:45.513 "num_base_bdevs_operational": 1, 00:23:45.513 "base_bdevs_list": [ 00:23:45.513 { 00:23:45.513 "name": null, 00:23:45.513 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:45.513 "is_configured": false, 00:23:45.513 
"data_offset": 2048, 00:23:45.513 "data_size": 63488 00:23:45.513 }, 00:23:45.513 { 00:23:45.513 "name": "BaseBdev2", 00:23:45.513 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:45.513 "is_configured": true, 00:23:45.513 "data_offset": 2048, 00:23:45.513 "data_size": 63488 00:23:45.513 } 00:23:45.513 ] 00:23:45.513 }' 00:23:45.513 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:45.513 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:46.082 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:46.082 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:46.082 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:46.082 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:46.082 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:46.082 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:46.082 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:46.082 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:46.082 "name": "raid_bdev1", 00:23:46.082 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:46.082 "strip_size_kb": 0, 00:23:46.082 "state": "online", 00:23:46.082 "raid_level": "raid1", 00:23:46.083 "superblock": true, 00:23:46.083 "num_base_bdevs": 2, 00:23:46.083 "num_base_bdevs_discovered": 1, 00:23:46.083 "num_base_bdevs_operational": 1, 00:23:46.083 "base_bdevs_list": [ 00:23:46.083 { 00:23:46.083 "name": null, 00:23:46.083 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:23:46.083 "is_configured": false, 00:23:46.083 "data_offset": 2048, 00:23:46.083 "data_size": 63488 00:23:46.083 }, 00:23:46.083 { 00:23:46.083 "name": "BaseBdev2", 00:23:46.083 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:46.083 "is_configured": true, 00:23:46.083 "data_offset": 2048, 00:23:46.083 "data_size": 63488 00:23:46.083 } 00:23:46.083 ] 00:23:46.083 }' 00:23:46.083 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:46.083 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:46.083 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:46.343 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:46.343 13:32:26 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:46.343 13:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:46.603 [2024-07-25 13:32:27.280725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:46.603 [2024-07-25 13:32:27.280759] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:46.603 [2024-07-25 13:32:27.280775] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x21abd00 00:23:46.603 [2024-07-25 13:32:27.280782] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:46.603 [2024-07-25 13:32:27.281060] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:46.603 [2024-07-25 13:32:27.281070] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:46.603 [2024-07-25 13:32:27.281115] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:46.603 [2024-07-25 13:32:27.281123] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:46.603 [2024-07-25 13:32:27.281128] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:46.603 BaseBdev1 00:23:46.603 13:32:27 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # sleep 1 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:47.545 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:47.545 13:32:28 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:47.805 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:47.805 "name": "raid_bdev1", 00:23:47.805 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:47.805 "strip_size_kb": 0, 00:23:47.805 "state": "online", 00:23:47.805 "raid_level": "raid1", 00:23:47.805 "superblock": true, 00:23:47.805 "num_base_bdevs": 2, 00:23:47.805 "num_base_bdevs_discovered": 1, 00:23:47.805 "num_base_bdevs_operational": 1, 00:23:47.805 "base_bdevs_list": [ 00:23:47.805 { 00:23:47.805 "name": null, 00:23:47.805 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:47.805 "is_configured": false, 00:23:47.805 "data_offset": 2048, 00:23:47.805 "data_size": 63488 00:23:47.805 }, 00:23:47.805 { 00:23:47.805 "name": "BaseBdev2", 00:23:47.805 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:47.805 "is_configured": true, 00:23:47.805 "data_offset": 2048, 00:23:47.805 "data_size": 63488 00:23:47.805 } 00:23:47.805 ] 00:23:47.805 }' 00:23:47.805 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:47.805 13:32:28 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:48.375 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:48.375 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:48.375 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:48.375 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:48.375 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:48.375 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:48.375 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:48.635 "name": "raid_bdev1", 00:23:48.635 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:48.635 "strip_size_kb": 0, 00:23:48.635 "state": "online", 00:23:48.635 "raid_level": "raid1", 00:23:48.635 "superblock": true, 00:23:48.635 "num_base_bdevs": 2, 00:23:48.635 "num_base_bdevs_discovered": 1, 00:23:48.635 "num_base_bdevs_operational": 1, 00:23:48.635 "base_bdevs_list": [ 00:23:48.635 { 00:23:48.635 "name": null, 00:23:48.635 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:48.635 "is_configured": false, 00:23:48.635 "data_offset": 2048, 00:23:48.635 "data_size": 63488 00:23:48.635 }, 00:23:48.635 { 00:23:48.635 "name": "BaseBdev2", 00:23:48.635 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:48.635 "is_configured": true, 00:23:48.635 "data_offset": 2048, 00:23:48.635 "data_size": 63488 00:23:48.635 } 00:23:48.635 ] 00:23:48.635 }' 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # local 
es=0 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:48.635 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:48.636 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:48.897 [2024-07-25 13:32:29.514643] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:48.897 [2024-07-25 13:32:29.514733] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:48.897 
[2024-07-25 13:32:29.514741] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:48.897 request: 00:23:48.897 { 00:23:48.897 "base_bdev": "BaseBdev1", 00:23:48.897 "raid_bdev": "raid_bdev1", 00:23:48.897 "method": "bdev_raid_add_base_bdev", 00:23:48.897 "req_id": 1 00:23:48.897 } 00:23:48.897 Got JSON-RPC error response 00:23:48.897 response: 00:23:48.897 { 00:23:48.897 "code": -22, 00:23:48.897 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:48.897 } 00:23:48.897 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # es=1 00:23:48.897 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:23:48.897 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:23:48.897 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:23:48.897 13:32:29 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@793 -- # sleep 1 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:49.839 13:32:30 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:49.839 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.098 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:50.099 "name": "raid_bdev1", 00:23:50.099 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:50.099 "strip_size_kb": 0, 00:23:50.099 "state": "online", 00:23:50.099 "raid_level": "raid1", 00:23:50.099 "superblock": true, 00:23:50.099 "num_base_bdevs": 2, 00:23:50.099 "num_base_bdevs_discovered": 1, 00:23:50.099 "num_base_bdevs_operational": 1, 00:23:50.099 "base_bdevs_list": [ 00:23:50.099 { 00:23:50.099 "name": null, 00:23:50.099 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.099 "is_configured": false, 00:23:50.099 "data_offset": 2048, 00:23:50.099 "data_size": 63488 00:23:50.099 }, 00:23:50.099 { 00:23:50.099 "name": "BaseBdev2", 00:23:50.099 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:50.099 "is_configured": true, 00:23:50.099 "data_offset": 2048, 00:23:50.099 "data_size": 63488 00:23:50.099 } 00:23:50.099 ] 00:23:50.099 }' 00:23:50.099 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:50.099 13:32:30 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:50.669 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:50.669 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:50.669 13:32:31 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:50.669 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:50.669 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:50.669 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:50.669 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:50.929 "name": "raid_bdev1", 00:23:50.929 "uuid": "eafaaa1f-e3af-488b-a6e4-680cca4ed5e3", 00:23:50.929 "strip_size_kb": 0, 00:23:50.929 "state": "online", 00:23:50.929 "raid_level": "raid1", 00:23:50.929 "superblock": true, 00:23:50.929 "num_base_bdevs": 2, 00:23:50.929 "num_base_bdevs_discovered": 1, 00:23:50.929 "num_base_bdevs_operational": 1, 00:23:50.929 "base_bdevs_list": [ 00:23:50.929 { 00:23:50.929 "name": null, 00:23:50.929 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:50.929 "is_configured": false, 00:23:50.929 "data_offset": 2048, 00:23:50.929 "data_size": 63488 00:23:50.929 }, 00:23:50.929 { 00:23:50.929 "name": "BaseBdev2", 00:23:50.929 "uuid": "3a8db15a-aeb6-5b4e-a692-88dcf0da8175", 00:23:50.929 "is_configured": true, 00:23:50.929 "data_offset": 2048, 00:23:50.929 "data_size": 63488 00:23:50.929 } 00:23:50.929 ] 00:23:50.929 }' 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:50.929 13:32:31 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@798 -- # killprocess 1008588 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # '[' -z 1008588 ']' 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # kill -0 1008588 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # uname 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1008588 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1008588' 00:23:50.929 killing process with pid 1008588 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@969 -- # kill 1008588 00:23:50.929 Received shutdown signal, test time was about 24.082305 seconds 00:23:50.929 00:23:50.929 Latency(us) 00:23:50.929 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:50.929 =================================================================================================================== 00:23:50.929 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:50.929 [2024-07-25 13:32:31.608695] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:50.929 [2024-07-25 13:32:31.608762] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:50.929 [2024-07-25 13:32:31.608795] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev 
base bdevs is 0, going to free all in destruct 00:23:50.929 [2024-07-25 13:32:31.608801] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21ab970 name raid_bdev1, state offline 00:23:50.929 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@974 -- # wait 1008588 00:23:50.929 [2024-07-25 13:32:31.620523] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@800 -- # return 0 00:23:51.190 00:23:51.190 real 0m27.895s 00:23:51.190 user 0m43.620s 00:23:51.190 sys 0m3.153s 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:51.190 ************************************ 00:23:51.190 END TEST raid_rebuild_test_sb_io 00:23:51.190 ************************************ 00:23:51.190 13:32:31 bdev_raid -- bdev/bdev_raid.sh@956 -- # for n in 2 4 00:23:51.190 13:32:31 bdev_raid -- bdev/bdev_raid.sh@957 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:23:51.190 13:32:31 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:23:51.190 13:32:31 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:23:51.190 13:32:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:51.190 ************************************ 00:23:51.190 START TEST raid_rebuild_test 00:23:51.190 ************************************ 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 false false true 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:23:51.190 
13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # local verify=true 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # local strip_size 
00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@592 -- # local create_arg 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@594 -- # local data_offset 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # raid_pid=1013747 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@613 -- # waitforlisten 1013747 /var/tmp/spdk-raid.sock 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@831 -- # '[' -z 1013747 ']' 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@836 -- # local max_retries=100 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:51.190 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@840 -- # xtrace_disable 00:23:51.190 13:32:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:51.190 [2024-07-25 13:32:31.894513] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:23:51.190 [2024-07-25 13:32:31.894603] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1013747 ] 00:23:51.190 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:51.190 Zero copy mechanism will not be used. 00:23:51.450 [2024-07-25 13:32:31.984331] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:51.450 [2024-07-25 13:32:32.052876] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:23:51.450 [2024-07-25 13:32:32.100680] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:51.450 [2024-07-25 13:32:32.100707] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:52.020 13:32:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:23:52.020 13:32:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@864 -- # return 0 00:23:52.020 13:32:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:52.020 13:32:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:52.279 BaseBdev1_malloc 00:23:52.279 13:32:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:52.539 [2024-07-25 
13:32:33.087472] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:52.539 [2024-07-25 13:32:33.087506] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:52.539 [2024-07-25 13:32:33.087519] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17bcd10 00:23:52.539 [2024-07-25 13:32:33.087525] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:52.539 [2024-07-25 13:32:33.088865] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:52.539 [2024-07-25 13:32:33.088887] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:52.539 BaseBdev1 00:23:52.539 13:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:52.539 13:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:52.539 BaseBdev2_malloc 00:23:52.539 13:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:52.798 [2024-07-25 13:32:33.442315] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:52.799 [2024-07-25 13:32:33.442341] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:52.799 [2024-07-25 13:32:33.442353] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17bd6d0 00:23:52.799 [2024-07-25 13:32:33.442360] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:52.799 [2024-07-25 13:32:33.443576] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:52.799 [2024-07-25 13:32:33.443593] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:52.799 BaseBdev2 00:23:52.799 13:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:52.799 13:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:53.059 BaseBdev3_malloc 00:23:53.059 13:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:53.059 [2024-07-25 13:32:33.829212] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:53.059 [2024-07-25 13:32:33.829239] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:53.059 [2024-07-25 13:32:33.829250] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1880a30 00:23:53.059 [2024-07-25 13:32:33.829256] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:53.059 [2024-07-25 13:32:33.830494] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:53.059 [2024-07-25 13:32:33.830511] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:53.059 BaseBdev3 00:23:53.059 13:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:23:53.059 13:32:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:53.320 BaseBdev4_malloc 00:23:53.320 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc 
-p BaseBdev4 00:23:53.580 [2024-07-25 13:32:34.212137] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:53.580 [2024-07-25 13:32:34.212165] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:53.580 [2024-07-25 13:32:34.212177] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17b5d60 00:23:53.580 [2024-07-25 13:32:34.212183] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:53.580 [2024-07-25 13:32:34.213421] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:53.580 [2024-07-25 13:32:34.213438] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:53.580 BaseBdev4 00:23:53.580 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:53.840 spare_malloc 00:23:53.840 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:53.840 spare_delay 00:23:53.840 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:54.099 [2024-07-25 13:32:34.771391] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:54.099 [2024-07-25 13:32:34.771419] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:54.099 [2024-07-25 13:32:34.771431] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17b7820 00:23:54.099 [2024-07-25 13:32:34.771438] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:54.099 
[2024-07-25 13:32:34.772616] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:54.099 [2024-07-25 13:32:34.772633] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:54.099 spare 00:23:54.099 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:23:54.359 [2024-07-25 13:32:34.959887] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:54.359 [2024-07-25 13:32:34.960887] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:54.359 [2024-07-25 13:32:34.960928] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:54.359 [2024-07-25 13:32:34.960961] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:54.359 [2024-07-25 13:32:34.961022] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x17b8fe0 00:23:54.359 [2024-07-25 13:32:34.961028] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:54.359 [2024-07-25 13:32:34.961186] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b7490 00:23:54.359 [2024-07-25 13:32:34.961299] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17b8fe0 00:23:54.359 [2024-07-25 13:32:34.961305] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17b8fe0 00:23:54.359 [2024-07-25 13:32:34.961386] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.359 13:32:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.619 13:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:54.619 "name": "raid_bdev1", 00:23:54.619 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:23:54.619 "strip_size_kb": 0, 00:23:54.619 "state": "online", 00:23:54.619 "raid_level": "raid1", 00:23:54.619 "superblock": false, 00:23:54.619 "num_base_bdevs": 4, 00:23:54.619 "num_base_bdevs_discovered": 4, 00:23:54.619 "num_base_bdevs_operational": 4, 00:23:54.619 "base_bdevs_list": [ 00:23:54.619 { 00:23:54.619 "name": "BaseBdev1", 00:23:54.619 "uuid": "84f390ad-276a-5876-9902-c5030463e2cd", 00:23:54.619 "is_configured": true, 00:23:54.619 "data_offset": 0, 00:23:54.619 "data_size": 65536 00:23:54.619 }, 00:23:54.619 { 00:23:54.619 "name": "BaseBdev2", 00:23:54.619 "uuid": "693c2f9c-3d1b-50b3-aa42-545715258e54", 
00:23:54.619 "is_configured": true, 00:23:54.619 "data_offset": 0, 00:23:54.619 "data_size": 65536 00:23:54.619 }, 00:23:54.619 { 00:23:54.619 "name": "BaseBdev3", 00:23:54.619 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:23:54.619 "is_configured": true, 00:23:54.619 "data_offset": 0, 00:23:54.619 "data_size": 65536 00:23:54.619 }, 00:23:54.619 { 00:23:54.619 "name": "BaseBdev4", 00:23:54.619 "uuid": "c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:23:54.619 "is_configured": true, 00:23:54.619 "data_offset": 0, 00:23:54.619 "data_size": 65536 00:23:54.619 } 00:23:54.619 ] 00:23:54.619 }' 00:23:54.619 13:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:54.619 13:32:35 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:55.188 13:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:55.188 13:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:23:55.188 [2024-07-25 13:32:35.882438] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:55.188 13:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:23:55.188 13:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.188 13:32:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@640 -- # 
local write_unit_size 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:55.447 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:55.707 [2024-07-25 13:32:36.279219] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b4c70 00:23:55.707 /dev/nbd0 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q 
-w nbd0 /proc/partitions 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:55.707 1+0 records in 00:23:55.707 1+0 records out 00:23:55.707 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000284132 s, 14.4 MB/s 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:23:55.707 13:32:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:24:05.700 65536+0 records in 00:24:05.700 65536+0 records out 00:24:05.700 33554432 bytes (34 MB, 32 MiB) copied, 8.43029 s, 4.0 MB/s 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:05.700 [2024-07-25 13:32:44.961974] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:05.700 13:32:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:05.700 [2024-07-25 13:32:45.130426] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:05.700 
13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:05.700 "name": "raid_bdev1", 00:24:05.700 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:24:05.700 "strip_size_kb": 0, 00:24:05.700 "state": "online", 00:24:05.700 "raid_level": "raid1", 00:24:05.700 "superblock": false, 00:24:05.700 "num_base_bdevs": 4, 00:24:05.700 "num_base_bdevs_discovered": 3, 00:24:05.700 "num_base_bdevs_operational": 3, 00:24:05.700 "base_bdevs_list": [ 00:24:05.700 { 00:24:05.700 "name": null, 00:24:05.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.700 "is_configured": 
false, 00:24:05.700 "data_offset": 0, 00:24:05.700 "data_size": 65536 00:24:05.700 }, 00:24:05.700 { 00:24:05.700 "name": "BaseBdev2", 00:24:05.700 "uuid": "693c2f9c-3d1b-50b3-aa42-545715258e54", 00:24:05.700 "is_configured": true, 00:24:05.700 "data_offset": 0, 00:24:05.700 "data_size": 65536 00:24:05.700 }, 00:24:05.700 { 00:24:05.700 "name": "BaseBdev3", 00:24:05.700 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:24:05.700 "is_configured": true, 00:24:05.700 "data_offset": 0, 00:24:05.700 "data_size": 65536 00:24:05.700 }, 00:24:05.700 { 00:24:05.700 "name": "BaseBdev4", 00:24:05.700 "uuid": "c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:24:05.700 "is_configured": true, 00:24:05.700 "data_offset": 0, 00:24:05.700 "data_size": 65536 00:24:05.700 } 00:24:05.700 ] 00:24:05.700 }' 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:05.700 13:32:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:05.700 [2024-07-25 13:32:46.020687] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:05.700 [2024-07-25 13:32:46.023555] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b4c70 00:24:05.700 [2024-07-25 13:32:46.025184] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:05.700 13:32:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:06.270 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:06.270 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:06.270 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local 
process_type=rebuild 00:24:06.270 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:06.270 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:06.270 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.270 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:06.530 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:06.530 "name": "raid_bdev1", 00:24:06.530 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:24:06.530 "strip_size_kb": 0, 00:24:06.530 "state": "online", 00:24:06.530 "raid_level": "raid1", 00:24:06.530 "superblock": false, 00:24:06.530 "num_base_bdevs": 4, 00:24:06.530 "num_base_bdevs_discovered": 4, 00:24:06.530 "num_base_bdevs_operational": 4, 00:24:06.530 "process": { 00:24:06.530 "type": "rebuild", 00:24:06.530 "target": "spare", 00:24:06.530 "progress": { 00:24:06.530 "blocks": 22528, 00:24:06.530 "percent": 34 00:24:06.530 } 00:24:06.530 }, 00:24:06.530 "base_bdevs_list": [ 00:24:06.530 { 00:24:06.530 "name": "spare", 00:24:06.530 "uuid": "9659bde7-28ee-50ea-8403-7aeb2979851b", 00:24:06.530 "is_configured": true, 00:24:06.530 "data_offset": 0, 00:24:06.530 "data_size": 65536 00:24:06.530 }, 00:24:06.530 { 00:24:06.530 "name": "BaseBdev2", 00:24:06.530 "uuid": "693c2f9c-3d1b-50b3-aa42-545715258e54", 00:24:06.530 "is_configured": true, 00:24:06.530 "data_offset": 0, 00:24:06.530 "data_size": 65536 00:24:06.530 }, 00:24:06.530 { 00:24:06.530 "name": "BaseBdev3", 00:24:06.530 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:24:06.530 "is_configured": true, 00:24:06.530 "data_offset": 0, 00:24:06.530 "data_size": 65536 00:24:06.530 }, 00:24:06.530 { 00:24:06.530 "name": "BaseBdev4", 00:24:06.530 "uuid": 
"c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:24:06.530 "is_configured": true, 00:24:06.530 "data_offset": 0, 00:24:06.530 "data_size": 65536 00:24:06.530 } 00:24:06.530 ] 00:24:06.530 }' 00:24:06.530 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:06.530 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:06.530 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:06.791 [2024-07-25 13:32:47.505441] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:06.791 [2024-07-25 13:32:47.534105] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:06.791 [2024-07-25 13:32:47.534137] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:06.791 [2024-07-25 13:32:47.534148] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:06.791 [2024-07-25 13:32:47.534153] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # 
local strip_size=0 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:06.791 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.052 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:07.052 "name": "raid_bdev1", 00:24:07.052 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:24:07.052 "strip_size_kb": 0, 00:24:07.052 "state": "online", 00:24:07.052 "raid_level": "raid1", 00:24:07.052 "superblock": false, 00:24:07.052 "num_base_bdevs": 4, 00:24:07.052 "num_base_bdevs_discovered": 3, 00:24:07.052 "num_base_bdevs_operational": 3, 00:24:07.052 "base_bdevs_list": [ 00:24:07.052 { 00:24:07.052 "name": null, 00:24:07.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.052 "is_configured": false, 00:24:07.052 "data_offset": 0, 00:24:07.052 "data_size": 65536 00:24:07.052 }, 00:24:07.052 { 00:24:07.052 "name": "BaseBdev2", 00:24:07.052 "uuid": "693c2f9c-3d1b-50b3-aa42-545715258e54", 00:24:07.052 "is_configured": true, 00:24:07.052 "data_offset": 0, 00:24:07.052 "data_size": 65536 00:24:07.052 }, 00:24:07.052 { 00:24:07.052 "name": "BaseBdev3", 00:24:07.052 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:24:07.052 "is_configured": true, 00:24:07.052 "data_offset": 0, 00:24:07.052 "data_size": 65536 
00:24:07.052 }, 00:24:07.052 { 00:24:07.052 "name": "BaseBdev4", 00:24:07.052 "uuid": "c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:24:07.052 "is_configured": true, 00:24:07.052 "data_offset": 0, 00:24:07.052 "data_size": 65536 00:24:07.052 } 00:24:07.052 ] 00:24:07.052 }' 00:24:07.052 13:32:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:07.052 13:32:47 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:07.624 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:07.624 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.624 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:07.624 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:07.624 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.624 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.624 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.884 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:07.884 "name": "raid_bdev1", 00:24:07.884 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:24:07.884 "strip_size_kb": 0, 00:24:07.884 "state": "online", 00:24:07.884 "raid_level": "raid1", 00:24:07.884 "superblock": false, 00:24:07.884 "num_base_bdevs": 4, 00:24:07.884 "num_base_bdevs_discovered": 3, 00:24:07.884 "num_base_bdevs_operational": 3, 00:24:07.884 "base_bdevs_list": [ 00:24:07.884 { 00:24:07.884 "name": null, 00:24:07.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.884 "is_configured": false, 00:24:07.884 "data_offset": 0, 00:24:07.884 
"data_size": 65536 00:24:07.884 }, 00:24:07.884 { 00:24:07.884 "name": "BaseBdev2", 00:24:07.884 "uuid": "693c2f9c-3d1b-50b3-aa42-545715258e54", 00:24:07.884 "is_configured": true, 00:24:07.884 "data_offset": 0, 00:24:07.884 "data_size": 65536 00:24:07.884 }, 00:24:07.884 { 00:24:07.884 "name": "BaseBdev3", 00:24:07.884 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:24:07.884 "is_configured": true, 00:24:07.884 "data_offset": 0, 00:24:07.884 "data_size": 65536 00:24:07.884 }, 00:24:07.884 { 00:24:07.884 "name": "BaseBdev4", 00:24:07.884 "uuid": "c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:24:07.884 "is_configured": true, 00:24:07.884 "data_offset": 0, 00:24:07.884 "data_size": 65536 00:24:07.884 } 00:24:07.884 ] 00:24:07.884 }' 00:24:07.884 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:07.884 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:07.884 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:07.884 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:07.884 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:08.144 [2024-07-25 13:32:48.780853] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:08.144 [2024-07-25 13:32:48.783632] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17b8a20 00:24:08.144 [2024-07-25 13:32:48.784799] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:08.144 13:32:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@678 -- # sleep 1 00:24:09.085 13:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:09.085 
13:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:09.085 13:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:09.085 13:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:09.085 13:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:09.085 13:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.085 13:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.346 13:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.346 "name": "raid_bdev1", 00:24:09.346 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:24:09.346 "strip_size_kb": 0, 00:24:09.346 "state": "online", 00:24:09.346 "raid_level": "raid1", 00:24:09.346 "superblock": false, 00:24:09.346 "num_base_bdevs": 4, 00:24:09.346 "num_base_bdevs_discovered": 4, 00:24:09.346 "num_base_bdevs_operational": 4, 00:24:09.346 "process": { 00:24:09.346 "type": "rebuild", 00:24:09.346 "target": "spare", 00:24:09.346 "progress": { 00:24:09.346 "blocks": 22528, 00:24:09.346 "percent": 34 00:24:09.346 } 00:24:09.346 }, 00:24:09.346 "base_bdevs_list": [ 00:24:09.346 { 00:24:09.346 "name": "spare", 00:24:09.346 "uuid": "9659bde7-28ee-50ea-8403-7aeb2979851b", 00:24:09.346 "is_configured": true, 00:24:09.346 "data_offset": 0, 00:24:09.346 "data_size": 65536 00:24:09.346 }, 00:24:09.346 { 00:24:09.346 "name": "BaseBdev2", 00:24:09.346 "uuid": "693c2f9c-3d1b-50b3-aa42-545715258e54", 00:24:09.346 "is_configured": true, 00:24:09.346 "data_offset": 0, 00:24:09.346 "data_size": 65536 00:24:09.346 }, 00:24:09.346 { 00:24:09.346 "name": "BaseBdev3", 00:24:09.346 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:24:09.346 
"is_configured": true, 00:24:09.346 "data_offset": 0, 00:24:09.346 "data_size": 65536 00:24:09.346 }, 00:24:09.346 { 00:24:09.346 "name": "BaseBdev4", 00:24:09.346 "uuid": "c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:24:09.346 "is_configured": true, 00:24:09.346 "data_offset": 0, 00:24:09.346 "data_size": 65536 00:24:09.346 } 00:24:09.346 ] 00:24:09.346 }' 00:24:09.346 13:32:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.346 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:09.346 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.346 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:09.346 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:24:09.346 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:24:09.346 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:24:09.346 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:24:09.346 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:09.607 [2024-07-25 13:32:50.265616] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:09.607 [2024-07-25 13:32:50.293650] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x17b8a20 00:24:09.607 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:24:09.607 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:24:09.607 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:24:09.607 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:09.607 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:09.607 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:09.607 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:09.607 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.607 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:09.868 "name": "raid_bdev1", 00:24:09.868 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:24:09.868 "strip_size_kb": 0, 00:24:09.868 "state": "online", 00:24:09.868 "raid_level": "raid1", 00:24:09.868 "superblock": false, 00:24:09.868 "num_base_bdevs": 4, 00:24:09.868 "num_base_bdevs_discovered": 3, 00:24:09.868 "num_base_bdevs_operational": 3, 00:24:09.868 "process": { 00:24:09.868 "type": "rebuild", 00:24:09.868 "target": "spare", 00:24:09.868 "progress": { 00:24:09.868 "blocks": 34816, 00:24:09.868 "percent": 53 00:24:09.868 } 00:24:09.868 }, 00:24:09.868 "base_bdevs_list": [ 00:24:09.868 { 00:24:09.868 "name": "spare", 00:24:09.868 "uuid": "9659bde7-28ee-50ea-8403-7aeb2979851b", 00:24:09.868 "is_configured": true, 00:24:09.868 "data_offset": 0, 00:24:09.868 "data_size": 65536 00:24:09.868 }, 00:24:09.868 { 00:24:09.868 "name": null, 00:24:09.868 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:09.868 "is_configured": false, 00:24:09.868 "data_offset": 0, 00:24:09.868 "data_size": 65536 00:24:09.868 }, 00:24:09.868 { 00:24:09.868 "name": "BaseBdev3", 00:24:09.868 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:24:09.868 
"is_configured": true, 00:24:09.868 "data_offset": 0, 00:24:09.868 "data_size": 65536 00:24:09.868 }, 00:24:09.868 { 00:24:09.868 "name": "BaseBdev4", 00:24:09.868 "uuid": "c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:24:09.868 "is_configured": true, 00:24:09.868 "data_offset": 0, 00:24:09.868 "data_size": 65536 00:24:09.868 } 00:24:09.868 ] 00:24:09.868 }' 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # local timeout=827 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:09.868 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.128 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.128 "name": 
"raid_bdev1", 00:24:10.128 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:24:10.128 "strip_size_kb": 0, 00:24:10.128 "state": "online", 00:24:10.128 "raid_level": "raid1", 00:24:10.128 "superblock": false, 00:24:10.128 "num_base_bdevs": 4, 00:24:10.128 "num_base_bdevs_discovered": 3, 00:24:10.128 "num_base_bdevs_operational": 3, 00:24:10.128 "process": { 00:24:10.128 "type": "rebuild", 00:24:10.128 "target": "spare", 00:24:10.128 "progress": { 00:24:10.129 "blocks": 38912, 00:24:10.129 "percent": 59 00:24:10.129 } 00:24:10.129 }, 00:24:10.129 "base_bdevs_list": [ 00:24:10.129 { 00:24:10.129 "name": "spare", 00:24:10.129 "uuid": "9659bde7-28ee-50ea-8403-7aeb2979851b", 00:24:10.129 "is_configured": true, 00:24:10.129 "data_offset": 0, 00:24:10.129 "data_size": 65536 00:24:10.129 }, 00:24:10.129 { 00:24:10.129 "name": null, 00:24:10.129 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.129 "is_configured": false, 00:24:10.129 "data_offset": 0, 00:24:10.129 "data_size": 65536 00:24:10.129 }, 00:24:10.129 { 00:24:10.129 "name": "BaseBdev3", 00:24:10.129 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:24:10.129 "is_configured": true, 00:24:10.129 "data_offset": 0, 00:24:10.129 "data_size": 65536 00:24:10.129 }, 00:24:10.129 { 00:24:10.129 "name": "BaseBdev4", 00:24:10.129 "uuid": "c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:24:10.129 "is_configured": true, 00:24:10.129 "data_offset": 0, 00:24:10.129 "data_size": 65536 00:24:10.129 } 00:24:10.129 ] 00:24:10.129 }' 00:24:10.129 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.129 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:10.129 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.129 13:32:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:10.129 13:32:50 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@726 -- # sleep 1 00:24:11.510 13:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:11.510 13:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:11.510 13:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:11.510 13:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:11.510 13:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:11.510 13:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.510 13:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.510 13:32:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.510 [2024-07-25 13:32:52.003640] bdev_raid.c:2886:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:11.510 [2024-07-25 13:32:52.003682] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:11.510 [2024-07-25 13:32:52.003711] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.510 "name": "raid_bdev1", 00:24:11.510 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:24:11.510 "strip_size_kb": 0, 00:24:11.510 "state": "online", 00:24:11.510 "raid_level": "raid1", 00:24:11.510 "superblock": false, 00:24:11.510 "num_base_bdevs": 4, 00:24:11.510 "num_base_bdevs_discovered": 3, 00:24:11.510 "num_base_bdevs_operational": 3, 00:24:11.510 "base_bdevs_list": [ 00:24:11.510 { 00:24:11.510 "name": "spare", 00:24:11.510 "uuid": "9659bde7-28ee-50ea-8403-7aeb2979851b", 00:24:11.510 
"is_configured": true, 00:24:11.510 "data_offset": 0, 00:24:11.510 "data_size": 65536 00:24:11.510 }, 00:24:11.510 { 00:24:11.510 "name": null, 00:24:11.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.510 "is_configured": false, 00:24:11.510 "data_offset": 0, 00:24:11.510 "data_size": 65536 00:24:11.510 }, 00:24:11.510 { 00:24:11.510 "name": "BaseBdev3", 00:24:11.510 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:24:11.510 "is_configured": true, 00:24:11.510 "data_offset": 0, 00:24:11.510 "data_size": 65536 00:24:11.510 }, 00:24:11.510 { 00:24:11.510 "name": "BaseBdev4", 00:24:11.510 "uuid": "c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:24:11.510 "is_configured": true, 00:24:11.510 "data_offset": 0, 00:24:11.510 "data_size": 65536 00:24:11.510 } 00:24:11.510 ] 00:24:11.510 }' 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@724 -- # break 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.510 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.769 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.769 "name": "raid_bdev1", 00:24:11.770 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:24:11.770 "strip_size_kb": 0, 00:24:11.770 "state": "online", 00:24:11.770 "raid_level": "raid1", 00:24:11.770 "superblock": false, 00:24:11.770 "num_base_bdevs": 4, 00:24:11.770 "num_base_bdevs_discovered": 3, 00:24:11.770 "num_base_bdevs_operational": 3, 00:24:11.770 "base_bdevs_list": [ 00:24:11.770 { 00:24:11.770 "name": "spare", 00:24:11.770 "uuid": "9659bde7-28ee-50ea-8403-7aeb2979851b", 00:24:11.770 "is_configured": true, 00:24:11.770 "data_offset": 0, 00:24:11.770 "data_size": 65536 00:24:11.770 }, 00:24:11.770 { 00:24:11.770 "name": null, 00:24:11.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.770 "is_configured": false, 00:24:11.770 "data_offset": 0, 00:24:11.770 "data_size": 65536 00:24:11.770 }, 00:24:11.770 { 00:24:11.770 "name": "BaseBdev3", 00:24:11.770 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:24:11.770 "is_configured": true, 00:24:11.770 "data_offset": 0, 00:24:11.770 "data_size": 65536 00:24:11.770 }, 00:24:11.770 { 00:24:11.770 "name": "BaseBdev4", 00:24:11.770 "uuid": "c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:24:11.770 "is_configured": true, 00:24:11.770 "data_offset": 0, 00:24:11.770 "data_size": 65536 00:24:11.770 } 00:24:11.770 ] 00:24:11.770 }' 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:11.770 13:32:52 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.770 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:12.029 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:12.029 "name": "raid_bdev1", 00:24:12.029 "uuid": "dfa2bf66-086e-495e-bf98-c2828c68e9b3", 00:24:12.029 "strip_size_kb": 0, 00:24:12.029 "state": "online", 00:24:12.029 "raid_level": "raid1", 00:24:12.029 "superblock": false, 00:24:12.029 "num_base_bdevs": 4, 00:24:12.029 "num_base_bdevs_discovered": 3, 00:24:12.029 "num_base_bdevs_operational": 3, 00:24:12.029 "base_bdevs_list": [ 00:24:12.029 { 00:24:12.029 "name": 
"spare", 00:24:12.029 "uuid": "9659bde7-28ee-50ea-8403-7aeb2979851b", 00:24:12.029 "is_configured": true, 00:24:12.029 "data_offset": 0, 00:24:12.029 "data_size": 65536 00:24:12.029 }, 00:24:12.029 { 00:24:12.029 "name": null, 00:24:12.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:12.029 "is_configured": false, 00:24:12.029 "data_offset": 0, 00:24:12.029 "data_size": 65536 00:24:12.029 }, 00:24:12.029 { 00:24:12.029 "name": "BaseBdev3", 00:24:12.029 "uuid": "c170a457-4dec-5a30-afde-598b20e5716b", 00:24:12.029 "is_configured": true, 00:24:12.029 "data_offset": 0, 00:24:12.029 "data_size": 65536 00:24:12.029 }, 00:24:12.029 { 00:24:12.029 "name": "BaseBdev4", 00:24:12.029 "uuid": "c17ea8a3-1fe5-5fca-8777-00ce75a52d43", 00:24:12.029 "is_configured": true, 00:24:12.029 "data_offset": 0, 00:24:12.029 "data_size": 65536 00:24:12.029 } 00:24:12.029 ] 00:24:12.029 }' 00:24:12.029 13:32:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:12.029 13:32:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:12.598 13:32:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:12.858 [2024-07-25 13:32:53.390505] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:12.858 [2024-07-25 13:32:53.390523] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:12.858 [2024-07-25 13:32:53.390568] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:12.858 [2024-07-25 13:32:53.390618] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:12.858 [2024-07-25 13:32:53.390625] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17b8fe0 name raid_bdev1, state offline 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # jq length 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:12.858 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:13.119 /dev/nbd0 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:13.119 1+0 records in 00:24:13.119 1+0 records out 00:24:13.119 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000298191 s, 13.7 MB/s 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:13.119 13:32:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:13.380 /dev/nbd1 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@869 -- # local i 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@873 -- # break 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:13.380 1+0 records in 00:24:13.380 1+0 records out 00:24:13.380 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000295033 s, 13.9 MB/s 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # size=4096 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@889 -- # return 0 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:13.380 13:32:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@753 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 
00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:13.641 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@798 -- # killprocess 1013747 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@950 -- # '[' -z 1013747 ']' 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # kill -0 1013747 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # uname 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1013747 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1013747' 00:24:13.902 killing process with pid 1013747 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@969 -- # kill 1013747 00:24:13.902 Received shutdown signal, test time was about 60.000000 seconds 00:24:13.902 00:24:13.902 Latency(us) 00:24:13.902 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:13.902 =================================================================================================================== 00:24:13.902 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:13.902 [2024-07-25 13:32:54.675856] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:13.902 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@974 -- # wait 1013747 00:24:14.162 [2024-07-25 13:32:54.701839] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@800 -- # return 0 00:24:14.162 00:24:14.162 real 0m22.993s 00:24:14.162 user 0m30.604s 00:24:14.162 sys 0m4.041s 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:24:14.162 ************************************ 00:24:14.162 END TEST raid_rebuild_test 00:24:14.162 ************************************ 00:24:14.162 13:32:54 bdev_raid -- bdev/bdev_raid.sh@958 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:24:14.162 13:32:54 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:24:14.162 13:32:54 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:14.162 13:32:54 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:24:14.162 ************************************ 00:24:14.162 START TEST raid_rebuild_test_sb 00:24:14.162 ************************************ 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 true false true 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # local verify=true 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:14.162 13:32:54 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # local strip_size 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # local create_arg 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@594 -- # local data_offset 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # raid_pid=1017793 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@613 -- # waitforlisten 1017793 /var/tmp/spdk-raid.sock 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@831 -- # '[' -z 1017793 ']' 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w 
randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:14.162 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:14.162 13:32:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:14.423 [2024-07-25 13:32:54.971418] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:24:14.423 [2024-07-25 13:32:54.971469] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1017793 ] 00:24:14.423 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:14.423 Zero copy mechanism will not be used. 
00:24:14.423 [2024-07-25 13:32:55.053354] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:14.423 [2024-07-25 13:32:55.120385] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:14.423 [2024-07-25 13:32:55.161422] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:14.423 [2024-07-25 13:32:55.161446] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:15.024 13:32:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:15.024 13:32:55 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@864 -- # return 0 00:24:15.024 13:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:15.024 13:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:15.307 BaseBdev1_malloc 00:24:15.307 13:32:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:15.568 [2024-07-25 13:32:56.159599] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:15.568 [2024-07-25 13:32:56.159632] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:15.568 [2024-07-25 13:32:56.159646] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x173bd10 00:24:15.568 [2024-07-25 13:32:56.159653] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:15.568 [2024-07-25 13:32:56.160902] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:15.568 [2024-07-25 13:32:56.160921] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:15.568 BaseBdev1 
00:24:15.568 13:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:15.568 13:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:15.568 BaseBdev2_malloc 00:24:15.568 13:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:15.828 [2024-07-25 13:32:56.530428] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:15.828 [2024-07-25 13:32:56.530457] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:15.828 [2024-07-25 13:32:56.530470] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x173c6d0 00:24:15.828 [2024-07-25 13:32:56.530481] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:15.828 [2024-07-25 13:32:56.531657] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:15.828 [2024-07-25 13:32:56.531674] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:15.828 BaseBdev2 00:24:15.828 13:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:15.828 13:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:16.089 BaseBdev3_malloc 00:24:16.089 13:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:16.349 [2024-07-25 13:32:56.897237] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:16.349 [2024-07-25 13:32:56.897263] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:16.349 [2024-07-25 13:32:56.897273] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17ffa30 00:24:16.349 [2024-07-25 13:32:56.897279] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:16.349 [2024-07-25 13:32:56.898443] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:16.349 [2024-07-25 13:32:56.898461] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:16.349 BaseBdev3 00:24:16.349 13:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:16.349 13:32:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:16.349 BaseBdev4_malloc 00:24:16.349 13:32:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:16.609 [2024-07-25 13:32:57.284161] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:16.609 [2024-07-25 13:32:57.284192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:16.609 [2024-07-25 13:32:57.284206] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1734d60 00:24:16.609 [2024-07-25 13:32:57.284212] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:16.609 [2024-07-25 13:32:57.285398] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:16.609 [2024-07-25 13:32:57.285417] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:24:16.609 BaseBdev4 00:24:16.609 13:32:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:16.870 spare_malloc 00:24:16.870 13:32:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:17.129 spare_delay 00:24:17.129 13:32:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:17.129 [2024-07-25 13:32:57.863532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:17.129 [2024-07-25 13:32:57.863567] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:17.129 [2024-07-25 13:32:57.863581] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1736820 00:24:17.129 [2024-07-25 13:32:57.863589] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:17.129 [2024-07-25 13:32:57.864783] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:17.129 [2024-07-25 13:32:57.864802] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:17.129 spare 00:24:17.129 13:32:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:17.390 [2024-07-25 13:32:58.052041] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:17.390 [2024-07-25 13:32:58.053042] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:17.390 [2024-07-25 13:32:58.053084] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:17.390 [2024-07-25 13:32:58.053118] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:17.390 [2024-07-25 13:32:58.053248] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1737fe0 00:24:17.390 [2024-07-25 13:32:58.053255] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:17.390 [2024-07-25 13:32:58.053413] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1736490 00:24:17.390 [2024-07-25 13:32:58.053529] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1737fe0 00:24:17.390 [2024-07-25 13:32:58.053534] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1737fe0 00:24:17.390 [2024-07-25 13:32:58.053623] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:17.390 
13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:17.390 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:17.651 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:17.651 "name": "raid_bdev1", 00:24:17.651 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:17.651 "strip_size_kb": 0, 00:24:17.651 "state": "online", 00:24:17.651 "raid_level": "raid1", 00:24:17.651 "superblock": true, 00:24:17.651 "num_base_bdevs": 4, 00:24:17.651 "num_base_bdevs_discovered": 4, 00:24:17.651 "num_base_bdevs_operational": 4, 00:24:17.651 "base_bdevs_list": [ 00:24:17.651 { 00:24:17.651 "name": "BaseBdev1", 00:24:17.651 "uuid": "0b0f90b3-c3bf-588a-87ca-eff4a1e92941", 00:24:17.651 "is_configured": true, 00:24:17.651 "data_offset": 2048, 00:24:17.651 "data_size": 63488 00:24:17.651 }, 00:24:17.651 { 00:24:17.651 "name": "BaseBdev2", 00:24:17.651 "uuid": "bbee81a0-721c-500d-8375-c9cd7c6467f8", 00:24:17.651 "is_configured": true, 00:24:17.651 "data_offset": 2048, 00:24:17.651 "data_size": 63488 00:24:17.651 }, 00:24:17.651 { 00:24:17.651 "name": "BaseBdev3", 00:24:17.651 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:17.651 "is_configured": true, 00:24:17.651 "data_offset": 2048, 00:24:17.651 "data_size": 63488 00:24:17.651 }, 00:24:17.651 { 00:24:17.651 "name": "BaseBdev4", 00:24:17.651 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:17.651 "is_configured": true, 00:24:17.651 "data_offset": 2048, 00:24:17.651 "data_size": 63488 00:24:17.651 } 00:24:17.651 ] 00:24:17.651 }' 00:24:17.651 13:32:58 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:17.651 13:32:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:18.221 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:24:18.221 13:32:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:18.481 [2024-07-25 13:32:59.026727] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # data_offset=2048 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # 
nbd_list=('/dev/nbd0') 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:18.481 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:24:18.742 [2024-07-25 13:32:59.423516] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1733c70 00:24:18.742 /dev/nbd0 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:24:18.742 1+0 records in 00:24:18.742 1+0 records out 00:24:18.742 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277769 s, 14.7 MB/s 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:24:18.742 13:32:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:24:28.736 63488+0 records in 00:24:28.736 63488+0 records out 00:24:28.736 32505856 bytes (33 MB, 31 MiB) copied, 8.31941 s, 3.9 MB/s 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@51 -- # local i 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:28.736 13:33:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:28.736 [2024-07-25 13:33:08.000209] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:28.736 [2024-07-25 13:33:08.178579] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.736 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:28.736 "name": "raid_bdev1", 00:24:28.736 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:28.736 "strip_size_kb": 0, 00:24:28.736 "state": "online", 00:24:28.736 "raid_level": "raid1", 00:24:28.736 "superblock": true, 00:24:28.736 "num_base_bdevs": 4, 00:24:28.736 "num_base_bdevs_discovered": 3, 00:24:28.736 "num_base_bdevs_operational": 3, 00:24:28.736 "base_bdevs_list": [ 00:24:28.736 { 00:24:28.736 "name": null, 00:24:28.736 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:28.736 "is_configured": false, 00:24:28.736 "data_offset": 2048, 00:24:28.736 "data_size": 63488 00:24:28.736 }, 00:24:28.736 { 00:24:28.736 "name": "BaseBdev2", 00:24:28.736 "uuid": "bbee81a0-721c-500d-8375-c9cd7c6467f8", 00:24:28.736 "is_configured": true, 00:24:28.736 "data_offset": 2048, 00:24:28.736 "data_size": 63488 00:24:28.736 }, 00:24:28.736 { 00:24:28.736 "name": "BaseBdev3", 
00:24:28.736 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:28.736 "is_configured": true, 00:24:28.736 "data_offset": 2048, 00:24:28.736 "data_size": 63488 00:24:28.736 }, 00:24:28.736 { 00:24:28.736 "name": "BaseBdev4", 00:24:28.736 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:28.736 "is_configured": true, 00:24:28.737 "data_offset": 2048, 00:24:28.737 "data_size": 63488 00:24:28.737 } 00:24:28.737 ] 00:24:28.737 }' 00:24:28.737 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:28.737 13:33:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:28.737 13:33:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:28.737 [2024-07-25 13:33:09.056799] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:28.737 [2024-07-25 13:33:09.059658] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1733c70 00:24:28.737 [2024-07-25 13:33:09.061287] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:28.737 13:33:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:29.306 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:29.306 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:29.306 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:29.306 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:29.306 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:29.306 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.306 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.565 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:29.565 "name": "raid_bdev1", 00:24:29.565 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:29.565 "strip_size_kb": 0, 00:24:29.565 "state": "online", 00:24:29.565 "raid_level": "raid1", 00:24:29.565 "superblock": true, 00:24:29.565 "num_base_bdevs": 4, 00:24:29.565 "num_base_bdevs_discovered": 4, 00:24:29.565 "num_base_bdevs_operational": 4, 00:24:29.565 "process": { 00:24:29.565 "type": "rebuild", 00:24:29.565 "target": "spare", 00:24:29.565 "progress": { 00:24:29.565 "blocks": 22528, 00:24:29.565 "percent": 35 00:24:29.565 } 00:24:29.565 }, 00:24:29.565 "base_bdevs_list": [ 00:24:29.565 { 00:24:29.565 "name": "spare", 00:24:29.565 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:29.565 "is_configured": true, 00:24:29.565 "data_offset": 2048, 00:24:29.565 "data_size": 63488 00:24:29.565 }, 00:24:29.565 { 00:24:29.565 "name": "BaseBdev2", 00:24:29.565 "uuid": "bbee81a0-721c-500d-8375-c9cd7c6467f8", 00:24:29.565 "is_configured": true, 00:24:29.565 "data_offset": 2048, 00:24:29.565 "data_size": 63488 00:24:29.565 }, 00:24:29.565 { 00:24:29.565 "name": "BaseBdev3", 00:24:29.565 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:29.565 "is_configured": true, 00:24:29.565 "data_offset": 2048, 00:24:29.565 "data_size": 63488 00:24:29.565 }, 00:24:29.565 { 00:24:29.565 "name": "BaseBdev4", 00:24:29.565 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:29.565 "is_configured": true, 00:24:29.565 "data_offset": 2048, 00:24:29.565 "data_size": 63488 00:24:29.565 } 00:24:29.565 ] 00:24:29.565 }' 00:24:29.565 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:24:29.565 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:29.565 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:29.825 [2024-07-25 13:33:10.553570] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:29.825 [2024-07-25 13:33:10.570183] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:29.825 [2024-07-25 13:33:10.570221] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:29.825 [2024-07-25 13:33:10.570233] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:29.825 [2024-07-25 13:33:10.570237] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:29.825 13:33:10 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.825 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.086 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:30.086 "name": "raid_bdev1", 00:24:30.086 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:30.086 "strip_size_kb": 0, 00:24:30.086 "state": "online", 00:24:30.086 "raid_level": "raid1", 00:24:30.086 "superblock": true, 00:24:30.086 "num_base_bdevs": 4, 00:24:30.086 "num_base_bdevs_discovered": 3, 00:24:30.086 "num_base_bdevs_operational": 3, 00:24:30.086 "base_bdevs_list": [ 00:24:30.086 { 00:24:30.086 "name": null, 00:24:30.086 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:30.086 "is_configured": false, 00:24:30.086 "data_offset": 2048, 00:24:30.086 "data_size": 63488 00:24:30.086 }, 00:24:30.086 { 00:24:30.086 "name": "BaseBdev2", 00:24:30.086 "uuid": "bbee81a0-721c-500d-8375-c9cd7c6467f8", 00:24:30.086 "is_configured": true, 00:24:30.086 "data_offset": 2048, 00:24:30.086 "data_size": 63488 00:24:30.086 }, 00:24:30.086 { 00:24:30.086 "name": "BaseBdev3", 00:24:30.086 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:30.086 "is_configured": true, 00:24:30.086 "data_offset": 2048, 00:24:30.086 "data_size": 63488 00:24:30.086 }, 00:24:30.086 { 00:24:30.086 "name": "BaseBdev4", 00:24:30.086 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:30.086 "is_configured": true, 00:24:30.086 "data_offset": 2048, 00:24:30.086 "data_size": 63488 
00:24:30.086 } 00:24:30.086 ] 00:24:30.086 }' 00:24:30.086 13:33:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:30.086 13:33:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:30.656 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:30.656 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:30.656 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:30.656 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:30.656 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:30.656 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.656 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:30.916 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:30.916 "name": "raid_bdev1", 00:24:30.916 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:30.916 "strip_size_kb": 0, 00:24:30.916 "state": "online", 00:24:30.916 "raid_level": "raid1", 00:24:30.916 "superblock": true, 00:24:30.916 "num_base_bdevs": 4, 00:24:30.916 "num_base_bdevs_discovered": 3, 00:24:30.916 "num_base_bdevs_operational": 3, 00:24:30.916 "base_bdevs_list": [ 00:24:30.916 { 00:24:30.916 "name": null, 00:24:30.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:30.916 "is_configured": false, 00:24:30.916 "data_offset": 2048, 00:24:30.916 "data_size": 63488 00:24:30.916 }, 00:24:30.916 { 00:24:30.916 "name": "BaseBdev2", 00:24:30.916 "uuid": "bbee81a0-721c-500d-8375-c9cd7c6467f8", 00:24:30.916 "is_configured": true, 00:24:30.916 
"data_offset": 2048, 00:24:30.916 "data_size": 63488 00:24:30.916 }, 00:24:30.916 { 00:24:30.916 "name": "BaseBdev3", 00:24:30.916 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:30.916 "is_configured": true, 00:24:30.916 "data_offset": 2048, 00:24:30.916 "data_size": 63488 00:24:30.916 }, 00:24:30.916 { 00:24:30.916 "name": "BaseBdev4", 00:24:30.916 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:30.916 "is_configured": true, 00:24:30.916 "data_offset": 2048, 00:24:30.916 "data_size": 63488 00:24:30.916 } 00:24:30.916 ] 00:24:30.916 }' 00:24:30.916 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:30.916 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:30.916 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:30.917 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:30.917 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:31.177 [2024-07-25 13:33:11.824985] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:31.177 [2024-07-25 13:33:11.827703] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1737a20 00:24:31.177 [2024-07-25 13:33:11.828867] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:31.177 13:33:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@678 -- # sleep 1 00:24:32.118 13:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:32.118 13:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:32.118 13:33:12 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:32.118 13:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:32.118 13:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:32.118 13:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.118 13:33:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:32.379 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:32.379 "name": "raid_bdev1", 00:24:32.379 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:32.379 "strip_size_kb": 0, 00:24:32.379 "state": "online", 00:24:32.379 "raid_level": "raid1", 00:24:32.379 "superblock": true, 00:24:32.379 "num_base_bdevs": 4, 00:24:32.379 "num_base_bdevs_discovered": 4, 00:24:32.379 "num_base_bdevs_operational": 4, 00:24:32.379 "process": { 00:24:32.379 "type": "rebuild", 00:24:32.379 "target": "spare", 00:24:32.379 "progress": { 00:24:32.379 "blocks": 22528, 00:24:32.379 "percent": 35 00:24:32.379 } 00:24:32.379 }, 00:24:32.379 "base_bdevs_list": [ 00:24:32.379 { 00:24:32.379 "name": "spare", 00:24:32.379 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:32.379 "is_configured": true, 00:24:32.379 "data_offset": 2048, 00:24:32.380 "data_size": 63488 00:24:32.380 }, 00:24:32.380 { 00:24:32.380 "name": "BaseBdev2", 00:24:32.380 "uuid": "bbee81a0-721c-500d-8375-c9cd7c6467f8", 00:24:32.380 "is_configured": true, 00:24:32.380 "data_offset": 2048, 00:24:32.380 "data_size": 63488 00:24:32.380 }, 00:24:32.380 { 00:24:32.380 "name": "BaseBdev3", 00:24:32.380 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:32.380 "is_configured": true, 00:24:32.380 "data_offset": 2048, 00:24:32.380 "data_size": 63488 00:24:32.380 }, 00:24:32.380 { 00:24:32.380 "name": 
"BaseBdev4", 00:24:32.380 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:32.380 "is_configured": true, 00:24:32.380 "data_offset": 2048, 00:24:32.380 "data_size": 63488 00:24:32.380 } 00:24:32.380 ] 00:24:32.380 }' 00:24:32.380 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:32.380 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:32.380 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:32.380 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:32.380 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:24:32.380 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:24:32.380 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:24:32.380 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:24:32.380 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:24:32.380 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:24:32.380 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:32.950 [2024-07-25 13:33:13.642643] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:33.209 [2024-07-25 13:33:13.840344] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1737a20 00:24:33.209 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:24:33.209 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 
00:24:33.209 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:33.209 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.209 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:33.209 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:33.209 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.209 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.209 13:33:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.469 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.469 "name": "raid_bdev1", 00:24:33.469 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:33.469 "strip_size_kb": 0, 00:24:33.469 "state": "online", 00:24:33.469 "raid_level": "raid1", 00:24:33.470 "superblock": true, 00:24:33.470 "num_base_bdevs": 4, 00:24:33.470 "num_base_bdevs_discovered": 3, 00:24:33.470 "num_base_bdevs_operational": 3, 00:24:33.470 "process": { 00:24:33.470 "type": "rebuild", 00:24:33.470 "target": "spare", 00:24:33.470 "progress": { 00:24:33.470 "blocks": 43008, 00:24:33.470 "percent": 67 00:24:33.470 } 00:24:33.470 }, 00:24:33.470 "base_bdevs_list": [ 00:24:33.470 { 00:24:33.470 "name": "spare", 00:24:33.470 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:33.470 "is_configured": true, 00:24:33.470 "data_offset": 2048, 00:24:33.470 "data_size": 63488 00:24:33.470 }, 00:24:33.470 { 00:24:33.470 "name": null, 00:24:33.470 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.470 "is_configured": false, 00:24:33.470 "data_offset": 2048, 00:24:33.470 
"data_size": 63488 00:24:33.470 }, 00:24:33.470 { 00:24:33.470 "name": "BaseBdev3", 00:24:33.470 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:33.470 "is_configured": true, 00:24:33.470 "data_offset": 2048, 00:24:33.470 "data_size": 63488 00:24:33.470 }, 00:24:33.470 { 00:24:33.470 "name": "BaseBdev4", 00:24:33.470 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:33.470 "is_configured": true, 00:24:33.470 "data_offset": 2048, 00:24:33.470 "data_size": 63488 00:24:33.470 } 00:24:33.470 ] 00:24:33.470 }' 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # local timeout=851 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.470 13:33:14 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.731 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.731 "name": "raid_bdev1", 00:24:33.731 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:33.731 "strip_size_kb": 0, 00:24:33.731 "state": "online", 00:24:33.731 "raid_level": "raid1", 00:24:33.731 "superblock": true, 00:24:33.731 "num_base_bdevs": 4, 00:24:33.731 "num_base_bdevs_discovered": 3, 00:24:33.731 "num_base_bdevs_operational": 3, 00:24:33.731 "process": { 00:24:33.731 "type": "rebuild", 00:24:33.731 "target": "spare", 00:24:33.731 "progress": { 00:24:33.731 "blocks": 47104, 00:24:33.731 "percent": 74 00:24:33.731 } 00:24:33.731 }, 00:24:33.731 "base_bdevs_list": [ 00:24:33.731 { 00:24:33.731 "name": "spare", 00:24:33.731 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:33.731 "is_configured": true, 00:24:33.731 "data_offset": 2048, 00:24:33.731 "data_size": 63488 00:24:33.731 }, 00:24:33.731 { 00:24:33.731 "name": null, 00:24:33.731 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.731 "is_configured": false, 00:24:33.731 "data_offset": 2048, 00:24:33.731 "data_size": 63488 00:24:33.731 }, 00:24:33.731 { 00:24:33.731 "name": "BaseBdev3", 00:24:33.731 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:33.731 "is_configured": true, 00:24:33.731 "data_offset": 2048, 00:24:33.731 "data_size": 63488 00:24:33.731 }, 00:24:33.731 { 00:24:33.731 "name": "BaseBdev4", 00:24:33.731 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:33.731 "is_configured": true, 00:24:33.731 "data_offset": 2048, 00:24:33.731 "data_size": 63488 00:24:33.731 } 00:24:33.731 ] 00:24:33.731 }' 00:24:33.731 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.731 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:33.731 13:33:14 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.731 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:33.731 13:33:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@726 -- # sleep 1 00:24:34.301 [2024-07-25 13:33:15.047338] bdev_raid.c:2886:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:34.301 [2024-07-25 13:33:15.047383] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:34.301 [2024-07-25 13:33:15.047459] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:34.870 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:24:34.870 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:34.871 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:34.871 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:34.871 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:34.871 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:34.871 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.871 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:34.871 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:34.871 "name": "raid_bdev1", 00:24:34.871 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:34.871 "strip_size_kb": 0, 00:24:34.871 "state": "online", 00:24:34.871 "raid_level": "raid1", 00:24:34.871 "superblock": true, 00:24:34.871 "num_base_bdevs": 
4, 00:24:34.871 "num_base_bdevs_discovered": 3, 00:24:34.871 "num_base_bdevs_operational": 3, 00:24:34.871 "base_bdevs_list": [ 00:24:34.871 { 00:24:34.871 "name": "spare", 00:24:34.871 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:34.871 "is_configured": true, 00:24:34.871 "data_offset": 2048, 00:24:34.871 "data_size": 63488 00:24:34.871 }, 00:24:34.871 { 00:24:34.871 "name": null, 00:24:34.871 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:34.871 "is_configured": false, 00:24:34.871 "data_offset": 2048, 00:24:34.871 "data_size": 63488 00:24:34.871 }, 00:24:34.871 { 00:24:34.871 "name": "BaseBdev3", 00:24:34.871 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:34.871 "is_configured": true, 00:24:34.871 "data_offset": 2048, 00:24:34.871 "data_size": 63488 00:24:34.871 }, 00:24:34.871 { 00:24:34.871 "name": "BaseBdev4", 00:24:34.871 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:34.871 "is_configured": true, 00:24:34.871 "data_offset": 2048, 00:24:34.871 "data_size": 63488 00:24:34.871 } 00:24:34.871 ] 00:24:34.871 }' 00:24:34.871 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:35.131 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:35.131 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:35.131 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:35.131 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@724 -- # break 00:24:35.131 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:35.131 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:35.131 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:35.131 13:33:15 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:35.131 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:35.131 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.131 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.391 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:35.391 "name": "raid_bdev1", 00:24:35.391 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:35.391 "strip_size_kb": 0, 00:24:35.391 "state": "online", 00:24:35.391 "raid_level": "raid1", 00:24:35.391 "superblock": true, 00:24:35.391 "num_base_bdevs": 4, 00:24:35.391 "num_base_bdevs_discovered": 3, 00:24:35.391 "num_base_bdevs_operational": 3, 00:24:35.391 "base_bdevs_list": [ 00:24:35.391 { 00:24:35.391 "name": "spare", 00:24:35.391 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:35.391 "is_configured": true, 00:24:35.391 "data_offset": 2048, 00:24:35.391 "data_size": 63488 00:24:35.391 }, 00:24:35.391 { 00:24:35.391 "name": null, 00:24:35.391 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:35.391 "is_configured": false, 00:24:35.391 "data_offset": 2048, 00:24:35.391 "data_size": 63488 00:24:35.391 }, 00:24:35.391 { 00:24:35.391 "name": "BaseBdev3", 00:24:35.391 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:35.391 "is_configured": true, 00:24:35.391 "data_offset": 2048, 00:24:35.391 "data_size": 63488 00:24:35.391 }, 00:24:35.391 { 00:24:35.391 "name": "BaseBdev4", 00:24:35.391 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:35.391 "is_configured": true, 00:24:35.391 "data_offset": 2048, 00:24:35.391 "data_size": 63488 00:24:35.391 } 00:24:35.391 ] 00:24:35.391 }' 00:24:35.391 13:33:15 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:35.391 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:35.391 13:33:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:35.391 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:35.687 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:35.687 "name": "raid_bdev1", 00:24:35.687 "uuid": 
"fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:35.687 "strip_size_kb": 0, 00:24:35.687 "state": "online", 00:24:35.687 "raid_level": "raid1", 00:24:35.687 "superblock": true, 00:24:35.687 "num_base_bdevs": 4, 00:24:35.687 "num_base_bdevs_discovered": 3, 00:24:35.687 "num_base_bdevs_operational": 3, 00:24:35.687 "base_bdevs_list": [ 00:24:35.687 { 00:24:35.687 "name": "spare", 00:24:35.687 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:35.687 "is_configured": true, 00:24:35.687 "data_offset": 2048, 00:24:35.687 "data_size": 63488 00:24:35.687 }, 00:24:35.687 { 00:24:35.687 "name": null, 00:24:35.687 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:35.687 "is_configured": false, 00:24:35.687 "data_offset": 2048, 00:24:35.687 "data_size": 63488 00:24:35.687 }, 00:24:35.687 { 00:24:35.687 "name": "BaseBdev3", 00:24:35.687 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:35.687 "is_configured": true, 00:24:35.687 "data_offset": 2048, 00:24:35.687 "data_size": 63488 00:24:35.687 }, 00:24:35.687 { 00:24:35.687 "name": "BaseBdev4", 00:24:35.687 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:35.687 "is_configured": true, 00:24:35.687 "data_offset": 2048, 00:24:35.687 "data_size": 63488 00:24:35.687 } 00:24:35.687 ] 00:24:35.687 }' 00:24:35.687 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:35.687 13:33:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:35.947 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:36.208 [2024-07-25 13:33:16.892141] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:36.208 [2024-07-25 13:33:16.892158] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:36.208 [2024-07-25 13:33:16.892199] bdev_raid.c: 487:_raid_bdev_destruct: 
*DEBUG*: raid_bdev_destruct 00:24:36.208 [2024-07-25 13:33:16.892251] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:36.208 [2024-07-25 13:33:16.892258] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1737fe0 name raid_bdev1, state offline 00:24:36.208 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:36.208 13:33:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # jq length 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:36.468 13:33:17 bdev_raid.raid_rebuild_test_sb -- 
bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:36.729 /dev/nbd0 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:36.729 1+0 records in 00:24:36.729 1+0 records out 00:24:36.729 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000200886 s, 20.4 MB/s 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
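The dd/stat sequence just above is the read probe that `waitfornbd` runs against each freshly attached nbd device: read one 4 KiB block and confirm a non-empty copy landed before the test proceeds. A minimal standalone sketch of the same probe, with a temp file standing in for `/dev/nbd0` (`iflag=direct` is dropped here, since direct I/O can fail on regular files; the paths and variable names are illustrative, not the test suite's own):

```shell
# Probe a "device" by copying one 4 KiB block and checking the result size,
# mirroring the dd/stat/rm steps in the trace above.
src=$(mktemp)
out=$(mktemp)
dd if=/dev/urandom of="$src" bs=4096 count=1 2>/dev/null   # stand-in for /dev/nbd0
dd if="$src" of="$out" bs=4096 count=1 2>/dev/null         # the read probe itself
size=$(stat -c %s "$out")                                   # same check as stat -c %s nbdtest
[ "$size" != 0 ] && echo "read probe ok: $size bytes"
rm -f "$src" "$out"                                         # probe file is discarded, as in the trace
```

The probe only proves the device answers reads at all; content verification happens later via `cmp`.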
00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:36.729 /dev/nbd1 00:24:36.729 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@869 -- # local i 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@873 -- # break 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:36.989 1+0 records in 00:24:36.989 1+0 records out 00:24:36.989 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000173277 
s, 23.6 MB/s 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # size=4096 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@889 -- # return 0 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:36.989 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb 
-- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:37.249 13:33:17 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:37.249 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:37.249 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:37.249 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:37.250 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:37.250 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:37.250 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:37.250 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:37.250 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:24:37.250 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:37.509 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:37.768 [2024-07-25 13:33:18.449925] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:37.768 [2024-07-25 13:33:18.449958] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:37.768 [2024-07-25 13:33:18.449971] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18de050 00:24:37.768 [2024-07-25 13:33:18.449978] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:37.768 [2024-07-25 13:33:18.451287] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:37.768 [2024-07-25 13:33:18.451309] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:37.768 [2024-07-25 13:33:18.451372] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:37.768 [2024-07-25 13:33:18.451392] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:37.768 [2024-07-25 13:33:18.451478] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:37.768 [2024-07-25 13:33:18.451533] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:37.768 spare 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:37.769 13:33:18 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:37.769 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:37.769 [2024-07-25 13:33:18.551832] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x173a930 00:24:37.769 [2024-07-25 13:33:18.551841] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:37.769 [2024-07-25 13:33:18.551988] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17fd920 00:24:37.769 [2024-07-25 13:33:18.552101] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x173a930 00:24:37.769 [2024-07-25 13:33:18.552106] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x173a930 00:24:37.769 [2024-07-25 13:33:18.552179] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:38.029 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:38.029 "name": "raid_bdev1", 00:24:38.029 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:38.029 "strip_size_kb": 0, 00:24:38.029 "state": "online", 00:24:38.029 "raid_level": "raid1", 
00:24:38.029 "superblock": true, 00:24:38.029 "num_base_bdevs": 4, 00:24:38.029 "num_base_bdevs_discovered": 3, 00:24:38.029 "num_base_bdevs_operational": 3, 00:24:38.029 "base_bdevs_list": [ 00:24:38.029 { 00:24:38.029 "name": "spare", 00:24:38.029 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:38.029 "is_configured": true, 00:24:38.029 "data_offset": 2048, 00:24:38.029 "data_size": 63488 00:24:38.029 }, 00:24:38.029 { 00:24:38.029 "name": null, 00:24:38.029 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:38.029 "is_configured": false, 00:24:38.029 "data_offset": 2048, 00:24:38.029 "data_size": 63488 00:24:38.029 }, 00:24:38.029 { 00:24:38.029 "name": "BaseBdev3", 00:24:38.029 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:38.029 "is_configured": true, 00:24:38.029 "data_offset": 2048, 00:24:38.029 "data_size": 63488 00:24:38.029 }, 00:24:38.029 { 00:24:38.029 "name": "BaseBdev4", 00:24:38.029 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:38.029 "is_configured": true, 00:24:38.029 "data_offset": 2048, 00:24:38.029 "data_size": 63488 00:24:38.029 } 00:24:38.029 ] 00:24:38.029 }' 00:24:38.029 13:33:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:38.029 13:33:18 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:38.599 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:38.599 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:38.599 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:38.599 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:38.599 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:38.599 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.599 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:38.859 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:38.859 "name": "raid_bdev1", 00:24:38.859 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:38.859 "strip_size_kb": 0, 00:24:38.859 "state": "online", 00:24:38.859 "raid_level": "raid1", 00:24:38.859 "superblock": true, 00:24:38.859 "num_base_bdevs": 4, 00:24:38.859 "num_base_bdevs_discovered": 3, 00:24:38.859 "num_base_bdevs_operational": 3, 00:24:38.859 "base_bdevs_list": [ 00:24:38.859 { 00:24:38.859 "name": "spare", 00:24:38.859 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:38.859 "is_configured": true, 00:24:38.859 "data_offset": 2048, 00:24:38.859 "data_size": 63488 00:24:38.859 }, 00:24:38.859 { 00:24:38.859 "name": null, 00:24:38.859 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:38.859 "is_configured": false, 00:24:38.859 "data_offset": 2048, 00:24:38.859 "data_size": 63488 00:24:38.859 }, 00:24:38.859 { 00:24:38.859 "name": "BaseBdev3", 00:24:38.859 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:38.859 "is_configured": true, 00:24:38.859 "data_offset": 2048, 00:24:38.859 "data_size": 63488 00:24:38.859 }, 00:24:38.859 { 00:24:38.859 "name": "BaseBdev4", 00:24:38.859 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:38.859 "is_configured": true, 00:24:38.859 "data_offset": 2048, 00:24:38.859 "data_size": 63488 00:24:38.859 } 00:24:38.859 ] 00:24:38.859 }' 00:24:38.859 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:38.859 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:38.859 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:24:38.859 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:38.859 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:38.859 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:39.119 [2024-07-25 13:33:19.881625] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.119 13:33:19 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.119 13:33:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.380 13:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.380 "name": "raid_bdev1", 00:24:39.380 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:39.380 "strip_size_kb": 0, 00:24:39.380 "state": "online", 00:24:39.380 "raid_level": "raid1", 00:24:39.380 "superblock": true, 00:24:39.380 "num_base_bdevs": 4, 00:24:39.380 "num_base_bdevs_discovered": 2, 00:24:39.380 "num_base_bdevs_operational": 2, 00:24:39.380 "base_bdevs_list": [ 00:24:39.380 { 00:24:39.380 "name": null, 00:24:39.380 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.380 "is_configured": false, 00:24:39.380 "data_offset": 2048, 00:24:39.380 "data_size": 63488 00:24:39.380 }, 00:24:39.380 { 00:24:39.380 "name": null, 00:24:39.380 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:39.380 "is_configured": false, 00:24:39.380 "data_offset": 2048, 00:24:39.380 "data_size": 63488 00:24:39.380 }, 00:24:39.380 { 00:24:39.380 "name": "BaseBdev3", 00:24:39.380 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:39.380 "is_configured": true, 00:24:39.380 "data_offset": 2048, 00:24:39.380 "data_size": 63488 00:24:39.380 }, 00:24:39.380 { 00:24:39.380 "name": "BaseBdev4", 00:24:39.380 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:39.380 "is_configured": true, 00:24:39.380 "data_offset": 2048, 00:24:39.380 "data_size": 63488 00:24:39.380 } 00:24:39.380 ] 00:24:39.380 }' 00:24:39.380 13:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.380 13:33:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:39.950 13:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@770 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:40.210 [2024-07-25 13:33:20.775913] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:40.210 [2024-07-25 13:33:20.776028] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:40.210 [2024-07-25 13:33:20.776037] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:40.210 [2024-07-25 13:33:20.776055] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:40.210 [2024-07-25 13:33:20.778647] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18e15b0 00:24:40.210 [2024-07-25 13:33:20.779727] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:40.210 13:33:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # sleep 1 00:24:41.149 13:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:41.149 13:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:41.149 13:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:41.149 13:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:41.149 13:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:41.149 13:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.149 13:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.409 13:33:21 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:41.409 "name": "raid_bdev1", 00:24:41.409 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:41.409 "strip_size_kb": 0, 00:24:41.409 "state": "online", 00:24:41.409 "raid_level": "raid1", 00:24:41.409 "superblock": true, 00:24:41.409 "num_base_bdevs": 4, 00:24:41.409 "num_base_bdevs_discovered": 3, 00:24:41.409 "num_base_bdevs_operational": 3, 00:24:41.409 "process": { 00:24:41.409 "type": "rebuild", 00:24:41.409 "target": "spare", 00:24:41.409 "progress": { 00:24:41.409 "blocks": 22528, 00:24:41.409 "percent": 35 00:24:41.409 } 00:24:41.409 }, 00:24:41.409 "base_bdevs_list": [ 00:24:41.409 { 00:24:41.409 "name": "spare", 00:24:41.409 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:41.409 "is_configured": true, 00:24:41.409 "data_offset": 2048, 00:24:41.409 "data_size": 63488 00:24:41.409 }, 00:24:41.409 { 00:24:41.409 "name": null, 00:24:41.409 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.409 "is_configured": false, 00:24:41.409 "data_offset": 2048, 00:24:41.409 "data_size": 63488 00:24:41.409 }, 00:24:41.409 { 00:24:41.409 "name": "BaseBdev3", 00:24:41.409 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:41.409 "is_configured": true, 00:24:41.409 "data_offset": 2048, 00:24:41.409 "data_size": 63488 00:24:41.409 }, 00:24:41.409 { 00:24:41.409 "name": "BaseBdev4", 00:24:41.409 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:41.409 "is_configured": true, 00:24:41.409 "data_offset": 2048, 00:24:41.409 "data_size": 63488 00:24:41.409 } 00:24:41.409 ] 00:24:41.409 }' 00:24:41.409 13:33:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:41.409 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:41.409 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:41.409 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- 
# [[ spare == \s\p\a\r\e ]] 00:24:41.409 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:41.670 [2024-07-25 13:33:22.264572] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:41.670 [2024-07-25 13:33:22.288597] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:41.670 [2024-07-25 13:33:22.288626] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:41.670 [2024-07-25 13:33:22.288636] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:41.670 [2024-07-25 13:33:22.288640] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.670 13:33:22 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:41.670 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.930 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.930 "name": "raid_bdev1", 00:24:41.930 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:41.930 "strip_size_kb": 0, 00:24:41.930 "state": "online", 00:24:41.930 "raid_level": "raid1", 00:24:41.930 "superblock": true, 00:24:41.930 "num_base_bdevs": 4, 00:24:41.930 "num_base_bdevs_discovered": 2, 00:24:41.930 "num_base_bdevs_operational": 2, 00:24:41.930 "base_bdevs_list": [ 00:24:41.930 { 00:24:41.930 "name": null, 00:24:41.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.930 "is_configured": false, 00:24:41.930 "data_offset": 2048, 00:24:41.930 "data_size": 63488 00:24:41.930 }, 00:24:41.930 { 00:24:41.930 "name": null, 00:24:41.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.930 "is_configured": false, 00:24:41.930 "data_offset": 2048, 00:24:41.930 "data_size": 63488 00:24:41.930 }, 00:24:41.930 { 00:24:41.930 "name": "BaseBdev3", 00:24:41.930 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:41.930 "is_configured": true, 00:24:41.930 "data_offset": 2048, 00:24:41.930 "data_size": 63488 00:24:41.930 }, 00:24:41.930 { 00:24:41.930 "name": "BaseBdev4", 00:24:41.930 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:41.930 "is_configured": true, 00:24:41.930 "data_offset": 2048, 00:24:41.930 "data_size": 63488 00:24:41.930 } 00:24:41.930 ] 00:24:41.930 }' 00:24:41.930 13:33:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.930 13:33:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:42.501 13:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:42.501 [2024-07-25 13:33:23.190889] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:42.501 [2024-07-25 13:33:23.190924] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:42.501 [2024-07-25 13:33:23.190937] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1733540 00:24:42.501 [2024-07-25 13:33:23.190943] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:42.501 [2024-07-25 13:33:23.191250] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:42.501 [2024-07-25 13:33:23.191261] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:42.501 [2024-07-25 13:33:23.191320] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:42.501 [2024-07-25 13:33:23.191327] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:42.501 [2024-07-25 13:33:23.191338] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:42.501 [2024-07-25 13:33:23.191350] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:42.501 [2024-07-25 13:33:23.194030] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x173af70 00:24:42.501 [2024-07-25 13:33:23.195111] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:42.501 spare 00:24:42.501 13:33:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # sleep 1 00:24:43.442 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:43.442 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:43.442 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:43.442 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:43.442 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:43.442 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.442 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.702 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:43.702 "name": "raid_bdev1", 00:24:43.702 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:43.702 "strip_size_kb": 0, 00:24:43.702 "state": "online", 00:24:43.702 "raid_level": "raid1", 00:24:43.702 "superblock": true, 00:24:43.702 "num_base_bdevs": 4, 00:24:43.702 "num_base_bdevs_discovered": 3, 00:24:43.702 "num_base_bdevs_operational": 3, 00:24:43.702 "process": { 00:24:43.702 "type": "rebuild", 00:24:43.702 "target": "spare", 00:24:43.702 "progress": { 00:24:43.702 "blocks": 22528, 00:24:43.702 
"percent": 35 00:24:43.702 } 00:24:43.702 }, 00:24:43.702 "base_bdevs_list": [ 00:24:43.702 { 00:24:43.702 "name": "spare", 00:24:43.702 "uuid": "bb4c7c53-8711-5922-ac66-1eaf7a44f42e", 00:24:43.702 "is_configured": true, 00:24:43.702 "data_offset": 2048, 00:24:43.702 "data_size": 63488 00:24:43.702 }, 00:24:43.702 { 00:24:43.702 "name": null, 00:24:43.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:43.702 "is_configured": false, 00:24:43.702 "data_offset": 2048, 00:24:43.702 "data_size": 63488 00:24:43.702 }, 00:24:43.702 { 00:24:43.702 "name": "BaseBdev3", 00:24:43.702 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:43.702 "is_configured": true, 00:24:43.702 "data_offset": 2048, 00:24:43.702 "data_size": 63488 00:24:43.702 }, 00:24:43.702 { 00:24:43.702 "name": "BaseBdev4", 00:24:43.702 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:43.702 "is_configured": true, 00:24:43.702 "data_offset": 2048, 00:24:43.702 "data_size": 63488 00:24:43.702 } 00:24:43.702 ] 00:24:43.702 }' 00:24:43.702 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:43.702 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:43.702 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:43.702 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:43.702 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:43.962 [2024-07-25 13:33:24.655887] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:43.962 [2024-07-25 13:33:24.703976] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:43.962 [2024-07-25 13:33:24.704006] bdev_raid.c: 
343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:43.962 [2024-07-25 13:33:24.704017] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:43.962 [2024-07-25 13:33:24.704021] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.962 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.222 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.222 "name": "raid_bdev1", 00:24:44.222 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:44.222 "strip_size_kb": 0, 00:24:44.222 "state": 
"online", 00:24:44.222 "raid_level": "raid1", 00:24:44.222 "superblock": true, 00:24:44.222 "num_base_bdevs": 4, 00:24:44.222 "num_base_bdevs_discovered": 2, 00:24:44.222 "num_base_bdevs_operational": 2, 00:24:44.222 "base_bdevs_list": [ 00:24:44.222 { 00:24:44.222 "name": null, 00:24:44.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.222 "is_configured": false, 00:24:44.222 "data_offset": 2048, 00:24:44.222 "data_size": 63488 00:24:44.222 }, 00:24:44.222 { 00:24:44.222 "name": null, 00:24:44.222 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.222 "is_configured": false, 00:24:44.222 "data_offset": 2048, 00:24:44.222 "data_size": 63488 00:24:44.222 }, 00:24:44.222 { 00:24:44.222 "name": "BaseBdev3", 00:24:44.222 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:44.222 "is_configured": true, 00:24:44.222 "data_offset": 2048, 00:24:44.222 "data_size": 63488 00:24:44.222 }, 00:24:44.222 { 00:24:44.222 "name": "BaseBdev4", 00:24:44.222 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:44.222 "is_configured": true, 00:24:44.222 "data_offset": 2048, 00:24:44.222 "data_size": 63488 00:24:44.222 } 00:24:44.222 ] 00:24:44.222 }' 00:24:44.222 13:33:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.222 13:33:24 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:44.790 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:44.790 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:44.790 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:44.790 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:44.790 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:44.790 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.790 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.051 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.051 "name": "raid_bdev1", 00:24:45.051 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:45.051 "strip_size_kb": 0, 00:24:45.051 "state": "online", 00:24:45.051 "raid_level": "raid1", 00:24:45.051 "superblock": true, 00:24:45.051 "num_base_bdevs": 4, 00:24:45.051 "num_base_bdevs_discovered": 2, 00:24:45.051 "num_base_bdevs_operational": 2, 00:24:45.051 "base_bdevs_list": [ 00:24:45.051 { 00:24:45.051 "name": null, 00:24:45.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.051 "is_configured": false, 00:24:45.051 "data_offset": 2048, 00:24:45.051 "data_size": 63488 00:24:45.051 }, 00:24:45.051 { 00:24:45.052 "name": null, 00:24:45.052 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.052 "is_configured": false, 00:24:45.052 "data_offset": 2048, 00:24:45.052 "data_size": 63488 00:24:45.052 }, 00:24:45.052 { 00:24:45.052 "name": "BaseBdev3", 00:24:45.052 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:45.052 "is_configured": true, 00:24:45.052 "data_offset": 2048, 00:24:45.052 "data_size": 63488 00:24:45.052 }, 00:24:45.052 { 00:24:45.052 "name": "BaseBdev4", 00:24:45.052 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:45.052 "is_configured": true, 00:24:45.052 "data_offset": 2048, 00:24:45.052 "data_size": 63488 00:24:45.052 } 00:24:45.052 ] 00:24:45.052 }' 00:24:45.052 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.052 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:45.052 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // 
"none"' 00:24:45.052 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:45.052 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:45.351 13:33:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:45.351 [2024-07-25 13:33:26.079482] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:45.351 [2024-07-25 13:33:26.079511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:45.351 [2024-07-25 13:33:26.079525] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x173bf40 00:24:45.351 [2024-07-25 13:33:26.079531] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:45.351 [2024-07-25 13:33:26.079808] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:45.351 [2024-07-25 13:33:26.079820] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:45.351 [2024-07-25 13:33:26.079866] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:45.351 [2024-07-25 13:33:26.079874] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:45.351 [2024-07-25 13:33:26.079880] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:45.351 BaseBdev1 00:24:45.351 13:33:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@789 -- # sleep 1 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:46.323 
13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.323 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:46.583 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:46.583 "name": "raid_bdev1", 00:24:46.583 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:46.583 "strip_size_kb": 0, 00:24:46.583 "state": "online", 00:24:46.583 "raid_level": "raid1", 00:24:46.583 "superblock": true, 00:24:46.583 "num_base_bdevs": 4, 00:24:46.583 "num_base_bdevs_discovered": 2, 00:24:46.583 "num_base_bdevs_operational": 2, 00:24:46.583 "base_bdevs_list": [ 00:24:46.583 { 00:24:46.583 "name": null, 00:24:46.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.583 "is_configured": false, 00:24:46.583 "data_offset": 2048, 00:24:46.583 "data_size": 63488 00:24:46.583 }, 
00:24:46.583 { 00:24:46.583 "name": null, 00:24:46.583 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:46.583 "is_configured": false, 00:24:46.583 "data_offset": 2048, 00:24:46.583 "data_size": 63488 00:24:46.583 }, 00:24:46.583 { 00:24:46.583 "name": "BaseBdev3", 00:24:46.583 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:46.583 "is_configured": true, 00:24:46.583 "data_offset": 2048, 00:24:46.583 "data_size": 63488 00:24:46.583 }, 00:24:46.583 { 00:24:46.583 "name": "BaseBdev4", 00:24:46.583 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:46.583 "is_configured": true, 00:24:46.583 "data_offset": 2048, 00:24:46.583 "data_size": 63488 00:24:46.583 } 00:24:46.583 ] 00:24:46.583 }' 00:24:46.583 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:46.583 13:33:27 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:47.152 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:47.152 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:47.152 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:47.152 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:47.152 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:47.152 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:47.152 13:33:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:47.413 "name": "raid_bdev1", 00:24:47.413 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:47.413 
"strip_size_kb": 0, 00:24:47.413 "state": "online", 00:24:47.413 "raid_level": "raid1", 00:24:47.413 "superblock": true, 00:24:47.413 "num_base_bdevs": 4, 00:24:47.413 "num_base_bdevs_discovered": 2, 00:24:47.413 "num_base_bdevs_operational": 2, 00:24:47.413 "base_bdevs_list": [ 00:24:47.413 { 00:24:47.413 "name": null, 00:24:47.413 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:47.413 "is_configured": false, 00:24:47.413 "data_offset": 2048, 00:24:47.413 "data_size": 63488 00:24:47.413 }, 00:24:47.413 { 00:24:47.413 "name": null, 00:24:47.413 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:47.413 "is_configured": false, 00:24:47.413 "data_offset": 2048, 00:24:47.413 "data_size": 63488 00:24:47.413 }, 00:24:47.413 { 00:24:47.413 "name": "BaseBdev3", 00:24:47.413 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:47.413 "is_configured": true, 00:24:47.413 "data_offset": 2048, 00:24:47.413 "data_size": 63488 00:24:47.413 }, 00:24:47.413 { 00:24:47.413 "name": "BaseBdev4", 00:24:47.413 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:47.413 "is_configured": true, 00:24:47.413 "data_offset": 2048, 00:24:47.413 "data_size": 63488 00:24:47.413 } 00:24:47.413 ] 00:24:47.413 }' 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@650 -- # local es=0 00:24:47.413 13:33:28 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:47.413 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:47.674 [2024-07-25 13:33:28.297212] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:47.674 [2024-07-25 13:33:28.297302] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:47.674 [2024-07-25 13:33:28.297310] 
bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:47.674 request: 00:24:47.674 { 00:24:47.674 "base_bdev": "BaseBdev1", 00:24:47.674 "raid_bdev": "raid_bdev1", 00:24:47.674 "method": "bdev_raid_add_base_bdev", 00:24:47.674 "req_id": 1 00:24:47.674 } 00:24:47.674 Got JSON-RPC error response 00:24:47.674 response: 00:24:47.674 { 00:24:47.674 "code": -22, 00:24:47.674 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:47.674 } 00:24:47.674 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@653 -- # es=1 00:24:47.674 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:24:47.674 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:24:47.674 13:33:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:24:47.674 13:33:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@793 -- # sleep 1 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.616 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.876 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:48.876 "name": "raid_bdev1", 00:24:48.876 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:48.876 "strip_size_kb": 0, 00:24:48.876 "state": "online", 00:24:48.876 "raid_level": "raid1", 00:24:48.876 "superblock": true, 00:24:48.876 "num_base_bdevs": 4, 00:24:48.876 "num_base_bdevs_discovered": 2, 00:24:48.876 "num_base_bdevs_operational": 2, 00:24:48.876 "base_bdevs_list": [ 00:24:48.876 { 00:24:48.876 "name": null, 00:24:48.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.876 "is_configured": false, 00:24:48.876 "data_offset": 2048, 00:24:48.876 "data_size": 63488 00:24:48.876 }, 00:24:48.876 { 00:24:48.876 "name": null, 00:24:48.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.876 "is_configured": false, 00:24:48.876 "data_offset": 2048, 00:24:48.876 "data_size": 63488 00:24:48.876 }, 00:24:48.876 { 00:24:48.876 "name": "BaseBdev3", 00:24:48.876 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 00:24:48.876 "is_configured": true, 00:24:48.876 "data_offset": 2048, 00:24:48.876 "data_size": 63488 00:24:48.876 }, 00:24:48.876 { 00:24:48.876 "name": "BaseBdev4", 00:24:48.876 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:48.876 "is_configured": true, 00:24:48.876 "data_offset": 2048, 00:24:48.876 "data_size": 63488 00:24:48.876 } 00:24:48.876 ] 00:24:48.876 }' 00:24:48.876 13:33:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:48.876 13:33:29 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:49.445 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:49.445 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.445 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:49.445 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:49.445 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.445 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.445 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.705 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:49.705 "name": "raid_bdev1", 00:24:49.705 "uuid": "fef33d0a-7f9b-4041-aee4-b66af775c67d", 00:24:49.705 "strip_size_kb": 0, 00:24:49.705 "state": "online", 00:24:49.705 "raid_level": "raid1", 00:24:49.705 "superblock": true, 00:24:49.705 "num_base_bdevs": 4, 00:24:49.705 "num_base_bdevs_discovered": 2, 00:24:49.705 "num_base_bdevs_operational": 2, 00:24:49.705 "base_bdevs_list": [ 00:24:49.705 { 00:24:49.705 "name": null, 00:24:49.705 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.705 "is_configured": false, 00:24:49.705 "data_offset": 2048, 00:24:49.705 "data_size": 63488 00:24:49.705 }, 00:24:49.706 { 00:24:49.706 "name": null, 00:24:49.706 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.706 "is_configured": false, 00:24:49.706 "data_offset": 2048, 00:24:49.706 "data_size": 63488 00:24:49.706 }, 00:24:49.706 { 00:24:49.706 "name": "BaseBdev3", 00:24:49.706 "uuid": "7049d2c8-f075-5a11-b8f9-9877fa8f9a40", 
00:24:49.706 "is_configured": true, 00:24:49.706 "data_offset": 2048, 00:24:49.706 "data_size": 63488 00:24:49.706 }, 00:24:49.706 { 00:24:49.706 "name": "BaseBdev4", 00:24:49.706 "uuid": "752a2e7f-0f63-5da2-9020-50a6dae14fbc", 00:24:49.706 "is_configured": true, 00:24:49.706 "data_offset": 2048, 00:24:49.706 "data_size": 63488 00:24:49.706 } 00:24:49.706 ] 00:24:49.706 }' 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@798 -- # killprocess 1017793 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@950 -- # '[' -z 1017793 ']' 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # kill -0 1017793 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # uname 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1017793 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1017793' 00:24:49.706 killing process with pid 1017793 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@969 -- # kill 1017793 00:24:49.706 
Received shutdown signal, test time was about 60.000000 seconds 00:24:49.706 00:24:49.706 Latency(us) 00:24:49.706 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:49.706 =================================================================================================================== 00:24:49.706 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:49.706 [2024-07-25 13:33:30.406882] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:49.706 [2024-07-25 13:33:30.406949] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:49.706 [2024-07-25 13:33:30.406991] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:49.706 [2024-07-25 13:33:30.406998] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x173a930 name raid_bdev1, state offline 00:24:49.706 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@974 -- # wait 1017793 00:24:49.706 [2024-07-25 13:33:30.432964] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@800 -- # return 0 00:24:49.967 00:24:49.967 real 0m35.653s 00:24:49.967 user 0m51.121s 00:24:49.967 sys 0m5.335s 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1126 -- # xtrace_disable 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:49.967 ************************************ 00:24:49.967 END TEST raid_rebuild_test_sb 00:24:49.967 ************************************ 00:24:49.967 13:33:30 bdev_raid -- bdev/bdev_raid.sh@959 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:24:49.967 13:33:30 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:24:49.967 13:33:30 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:24:49.967 13:33:30 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:24:49.967 ************************************ 00:24:49.967 START TEST raid_rebuild_test_io 00:24:49.967 ************************************ 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 false true true 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@586 -- # local superblock=false 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:49.967 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:49.974 13:33:30 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # local strip_size 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # '[' false = true ']' 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1024647 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1024647 /var/tmp/spdk-raid.sock 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@831 -- # '[' -z 1024647 ']' 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:49.974 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:24:49.974 13:33:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:49.974 [2024-07-25 13:33:30.698182] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:24:49.974 [2024-07-25 13:33:30.698237] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1024647 ] 00:24:49.974 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:49.974 Zero copy mechanism will not be used. 
00:24:50.235 [2024-07-25 13:33:30.789590] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:50.235 [2024-07-25 13:33:30.869226] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:24:50.235 [2024-07-25 13:33:30.915048] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:50.235 [2024-07-25 13:33:30.915072] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:50.806 13:33:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:24:50.806 13:33:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@864 -- # return 0 00:24:50.806 13:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:50.806 13:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:51.066 BaseBdev1_malloc 00:24:51.066 13:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:51.325 [2024-07-25 13:33:31.909607] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:51.325 [2024-07-25 13:33:31.909640] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:51.325 [2024-07-25 13:33:31.909654] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc51d10 00:24:51.325 [2024-07-25 13:33:31.909661] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:51.325 [2024-07-25 13:33:31.910924] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:51.325 [2024-07-25 13:33:31.910944] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:51.325 BaseBdev1 
00:24:51.325 13:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:51.325 13:33:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:51.325 BaseBdev2_malloc 00:24:51.325 13:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:51.585 [2024-07-25 13:33:32.280479] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:51.585 [2024-07-25 13:33:32.280509] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:51.585 [2024-07-25 13:33:32.280523] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc526d0 00:24:51.585 [2024-07-25 13:33:32.280534] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:51.585 [2024-07-25 13:33:32.281711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:51.585 [2024-07-25 13:33:32.281730] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:51.585 BaseBdev2 00:24:51.585 13:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:51.585 13:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:51.845 BaseBdev3_malloc 00:24:51.845 13:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:52.104 [2024-07-25 13:33:32.667315] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:52.104 [2024-07-25 13:33:32.667343] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:52.104 [2024-07-25 13:33:32.667354] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xd15a30 00:24:52.104 [2024-07-25 13:33:32.667360] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:52.104 [2024-07-25 13:33:32.668553] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:52.104 [2024-07-25 13:33:32.668571] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:52.104 BaseBdev3 00:24:52.104 13:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:24:52.104 13:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:52.104 BaseBdev4_malloc 00:24:52.104 13:33:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:52.363 [2024-07-25 13:33:33.050187] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:52.363 [2024-07-25 13:33:33.050213] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:52.363 [2024-07-25 13:33:33.050225] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4ad60 00:24:52.363 [2024-07-25 13:33:33.050231] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:52.363 [2024-07-25 13:33:33.051413] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:52.363 [2024-07-25 13:33:33.051432] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:24:52.363 BaseBdev4 00:24:52.363 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:52.623 spare_malloc 00:24:52.623 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:52.882 spare_delay 00:24:52.882 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:52.882 [2024-07-25 13:33:33.625629] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:52.882 [2024-07-25 13:33:33.625657] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:52.882 [2024-07-25 13:33:33.625670] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc4c820 00:24:52.882 [2024-07-25 13:33:33.625677] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:52.882 [2024-07-25 13:33:33.626867] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:52.882 [2024-07-25 13:33:33.626885] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:52.882 spare 00:24:52.882 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:53.142 [2024-07-25 13:33:33.818135] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:53.142 [2024-07-25 13:33:33.819141] bdev_raid.c:3312:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:24:53.142 [2024-07-25 13:33:33.819184] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:53.142 [2024-07-25 13:33:33.819218] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:53.142 [2024-07-25 13:33:33.819279] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0xc4dfe0 00:24:53.142 [2024-07-25 13:33:33.819285] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:53.142 [2024-07-25 13:33:33.819449] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc4c490 00:24:53.142 [2024-07-25 13:33:33.819569] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc4dfe0 00:24:53.142 [2024-07-25 13:33:33.819575] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc4dfe0 00:24:53.142 [2024-07-25 13:33:33.819658] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:53.142 13:33:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.402 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:53.402 "name": "raid_bdev1", 00:24:53.402 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:24:53.402 "strip_size_kb": 0, 00:24:53.402 "state": "online", 00:24:53.402 "raid_level": "raid1", 00:24:53.402 "superblock": false, 00:24:53.402 "num_base_bdevs": 4, 00:24:53.402 "num_base_bdevs_discovered": 4, 00:24:53.402 "num_base_bdevs_operational": 4, 00:24:53.402 "base_bdevs_list": [ 00:24:53.402 { 00:24:53.402 "name": "BaseBdev1", 00:24:53.402 "uuid": "90b7dfcc-9518-56ae-907b-2a90818346ab", 00:24:53.402 "is_configured": true, 00:24:53.402 "data_offset": 0, 00:24:53.402 "data_size": 65536 00:24:53.402 }, 00:24:53.402 { 00:24:53.402 "name": "BaseBdev2", 00:24:53.402 "uuid": "06c5035e-2aa0-592f-a197-cbf89cb07d0a", 00:24:53.402 "is_configured": true, 00:24:53.402 "data_offset": 0, 00:24:53.402 "data_size": 65536 00:24:53.402 }, 00:24:53.403 { 00:24:53.403 "name": "BaseBdev3", 00:24:53.403 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:24:53.403 "is_configured": true, 00:24:53.403 "data_offset": 0, 00:24:53.403 "data_size": 65536 00:24:53.403 }, 00:24:53.403 { 00:24:53.403 "name": "BaseBdev4", 00:24:53.403 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:24:53.403 "is_configured": true, 00:24:53.403 "data_offset": 0, 00:24:53.403 "data_size": 65536 00:24:53.403 } 00:24:53.403 ] 00:24:53.403 }' 00:24:53.403 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:24:53.403 13:33:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:53.972 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:53.972 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:24:53.972 [2024-07-25 13:33:34.716618] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:53.972 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=65536 00:24:53.972 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:53.972 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:54.232 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@634 -- # data_offset=0 00:24:54.232 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:24:54.232 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:54.232 13:33:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:54.232 [2024-07-25 13:33:35.010517] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdf3c40 00:24:54.232 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:54.232 Zero copy mechanism will not be used. 00:24:54.232 Running I/O for 60 seconds... 
00:24:54.491 [2024-07-25 13:33:35.120510] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:54.491 [2024-07-25 13:33:35.126938] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xdf3c40 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.491 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:54.750 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:54.750 "name": "raid_bdev1", 00:24:54.750 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:24:54.750 "strip_size_kb": 0, 00:24:54.750 "state": "online", 00:24:54.750 "raid_level": "raid1", 00:24:54.750 "superblock": false, 
00:24:54.750 "num_base_bdevs": 4, 00:24:54.750 "num_base_bdevs_discovered": 3, 00:24:54.750 "num_base_bdevs_operational": 3, 00:24:54.750 "base_bdevs_list": [ 00:24:54.750 { 00:24:54.750 "name": null, 00:24:54.750 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:54.750 "is_configured": false, 00:24:54.750 "data_offset": 0, 00:24:54.750 "data_size": 65536 00:24:54.750 }, 00:24:54.750 { 00:24:54.750 "name": "BaseBdev2", 00:24:54.750 "uuid": "06c5035e-2aa0-592f-a197-cbf89cb07d0a", 00:24:54.750 "is_configured": true, 00:24:54.750 "data_offset": 0, 00:24:54.750 "data_size": 65536 00:24:54.750 }, 00:24:54.750 { 00:24:54.750 "name": "BaseBdev3", 00:24:54.750 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:24:54.750 "is_configured": true, 00:24:54.750 "data_offset": 0, 00:24:54.750 "data_size": 65536 00:24:54.750 }, 00:24:54.750 { 00:24:54.750 "name": "BaseBdev4", 00:24:54.750 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:24:54.750 "is_configured": true, 00:24:54.750 "data_offset": 0, 00:24:54.750 "data_size": 65536 00:24:54.750 } 00:24:54.750 ] 00:24:54.750 }' 00:24:54.750 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:54.750 13:33:35 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:55.320 13:33:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:55.320 [2024-07-25 13:33:36.062768] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:55.580 [2024-07-25 13:33:36.112057] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdf3ee0 00:24:55.580 [2024-07-25 13:33:36.113710] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:55.580 13:33:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:55.580 [2024-07-25 
13:33:36.223600] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:55.580 [2024-07-25 13:33:36.223871] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:55.580 [2024-07-25 13:33:36.355369] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:55.580 [2024-07-25 13:33:36.355517] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:56.149 [2024-07-25 13:33:36.747179] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:56.149 [2024-07-25 13:33:36.754079] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:56.409 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:56.409 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:56.409 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:56.409 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:56.409 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:56.409 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.409 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.409 [2024-07-25 13:33:37.199533] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:56.669 
13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:56.669 "name": "raid_bdev1", 00:24:56.669 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:24:56.669 "strip_size_kb": 0, 00:24:56.669 "state": "online", 00:24:56.669 "raid_level": "raid1", 00:24:56.669 "superblock": false, 00:24:56.669 "num_base_bdevs": 4, 00:24:56.669 "num_base_bdevs_discovered": 4, 00:24:56.669 "num_base_bdevs_operational": 4, 00:24:56.669 "process": { 00:24:56.669 "type": "rebuild", 00:24:56.669 "target": "spare", 00:24:56.669 "progress": { 00:24:56.669 "blocks": 16384, 00:24:56.669 "percent": 25 00:24:56.669 } 00:24:56.669 }, 00:24:56.669 "base_bdevs_list": [ 00:24:56.669 { 00:24:56.669 "name": "spare", 00:24:56.669 "uuid": "88b76461-d8a3-5240-a54f-a98f3e9cbd4a", 00:24:56.669 "is_configured": true, 00:24:56.669 "data_offset": 0, 00:24:56.669 "data_size": 65536 00:24:56.669 }, 00:24:56.669 { 00:24:56.669 "name": "BaseBdev2", 00:24:56.669 "uuid": "06c5035e-2aa0-592f-a197-cbf89cb07d0a", 00:24:56.669 "is_configured": true, 00:24:56.669 "data_offset": 0, 00:24:56.669 "data_size": 65536 00:24:56.669 }, 00:24:56.669 { 00:24:56.669 "name": "BaseBdev3", 00:24:56.669 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:24:56.669 "is_configured": true, 00:24:56.669 "data_offset": 0, 00:24:56.669 "data_size": 65536 00:24:56.669 }, 00:24:56.669 { 00:24:56.669 "name": "BaseBdev4", 00:24:56.669 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:24:56.669 "is_configured": true, 00:24:56.669 "data_offset": 0, 00:24:56.669 "data_size": 65536 00:24:56.669 } 00:24:56.669 ] 00:24:56.669 }' 00:24:56.669 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:56.669 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:56.669 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:56.669 13:33:37 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:56.669 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:56.669 [2024-07-25 13:33:37.439640] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:56.669 [2024-07-25 13:33:37.439893] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:56.928 [2024-07-25 13:33:37.566084] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:56.928 [2024-07-25 13:33:37.569710] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:56.928 [2024-07-25 13:33:37.677935] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:56.928 [2024-07-25 13:33:37.695051] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:56.928 [2024-07-25 13:33:37.695071] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:56.928 [2024-07-25 13:33:37.695076] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:56.928 [2024-07-25 13:33:37.704945] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xdf3c40 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:57.188 "name": "raid_bdev1", 00:24:57.188 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:24:57.188 "strip_size_kb": 0, 00:24:57.188 "state": "online", 00:24:57.188 "raid_level": "raid1", 00:24:57.188 "superblock": false, 00:24:57.188 "num_base_bdevs": 4, 00:24:57.188 "num_base_bdevs_discovered": 3, 00:24:57.188 "num_base_bdevs_operational": 3, 00:24:57.188 "base_bdevs_list": [ 00:24:57.188 { 00:24:57.188 "name": null, 00:24:57.188 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:57.188 "is_configured": false, 00:24:57.188 "data_offset": 0, 00:24:57.188 "data_size": 65536 00:24:57.188 }, 00:24:57.188 { 00:24:57.188 "name": "BaseBdev2", 00:24:57.188 "uuid": "06c5035e-2aa0-592f-a197-cbf89cb07d0a", 00:24:57.188 "is_configured": true, 00:24:57.188 "data_offset": 0, 00:24:57.188 "data_size": 65536 00:24:57.188 }, 00:24:57.188 { 00:24:57.188 "name": "BaseBdev3", 00:24:57.188 "uuid": 
"9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:24:57.188 "is_configured": true, 00:24:57.188 "data_offset": 0, 00:24:57.188 "data_size": 65536 00:24:57.188 }, 00:24:57.188 { 00:24:57.188 "name": "BaseBdev4", 00:24:57.188 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:24:57.188 "is_configured": true, 00:24:57.188 "data_offset": 0, 00:24:57.188 "data_size": 65536 00:24:57.188 } 00:24:57.188 ] 00:24:57.188 }' 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:57.188 13:33:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:57.757 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:57.757 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:57.757 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:57.757 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:57.757 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:57.757 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:57.757 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:58.017 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:58.017 "name": "raid_bdev1", 00:24:58.017 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:24:58.017 "strip_size_kb": 0, 00:24:58.017 "state": "online", 00:24:58.017 "raid_level": "raid1", 00:24:58.017 "superblock": false, 00:24:58.017 "num_base_bdevs": 4, 00:24:58.017 "num_base_bdevs_discovered": 3, 00:24:58.017 "num_base_bdevs_operational": 3, 00:24:58.017 "base_bdevs_list": [ 00:24:58.017 { 
00:24:58.017 "name": null, 00:24:58.017 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:58.017 "is_configured": false, 00:24:58.017 "data_offset": 0, 00:24:58.017 "data_size": 65536 00:24:58.017 }, 00:24:58.017 { 00:24:58.017 "name": "BaseBdev2", 00:24:58.017 "uuid": "06c5035e-2aa0-592f-a197-cbf89cb07d0a", 00:24:58.017 "is_configured": true, 00:24:58.017 "data_offset": 0, 00:24:58.017 "data_size": 65536 00:24:58.017 }, 00:24:58.017 { 00:24:58.017 "name": "BaseBdev3", 00:24:58.017 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:24:58.017 "is_configured": true, 00:24:58.017 "data_offset": 0, 00:24:58.017 "data_size": 65536 00:24:58.017 }, 00:24:58.017 { 00:24:58.017 "name": "BaseBdev4", 00:24:58.017 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:24:58.017 "is_configured": true, 00:24:58.017 "data_offset": 0, 00:24:58.017 "data_size": 65536 00:24:58.017 } 00:24:58.017 ] 00:24:58.017 }' 00:24:58.017 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:58.017 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:58.017 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:58.277 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:58.277 13:33:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:58.277 [2024-07-25 13:33:38.979592] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:58.277 [2024-07-25 13:33:39.022245] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc515d0 00:24:58.277 [2024-07-25 13:33:39.023422] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:58.277 13:33:39 bdev_raid.raid_rebuild_test_io 
-- bdev/bdev_raid.sh@678 -- # sleep 1 00:24:58.536 [2024-07-25 13:33:39.138549] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:58.536 [2024-07-25 13:33:39.139316] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:58.796 [2024-07-25 13:33:39.363638] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:58.796 [2024-07-25 13:33:39.363753] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:59.056 [2024-07-25 13:33:39.719361] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:59.056 [2024-07-25 13:33:39.719763] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:59.315 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:59.315 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:59.315 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:59.315 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:59.315 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:59.315 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.315 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.315 [2024-07-25 13:33:40.063243] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:59.575 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:59.575 "name": "raid_bdev1", 00:24:59.575 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:24:59.575 "strip_size_kb": 0, 00:24:59.575 "state": "online", 00:24:59.575 "raid_level": "raid1", 00:24:59.575 "superblock": false, 00:24:59.575 "num_base_bdevs": 4, 00:24:59.575 "num_base_bdevs_discovered": 4, 00:24:59.575 "num_base_bdevs_operational": 4, 00:24:59.575 "process": { 00:24:59.575 "type": "rebuild", 00:24:59.575 "target": "spare", 00:24:59.575 "progress": { 00:24:59.575 "blocks": 14336, 00:24:59.575 "percent": 21 00:24:59.575 } 00:24:59.575 }, 00:24:59.575 "base_bdevs_list": [ 00:24:59.575 { 00:24:59.575 "name": "spare", 00:24:59.575 "uuid": "88b76461-d8a3-5240-a54f-a98f3e9cbd4a", 00:24:59.575 "is_configured": true, 00:24:59.575 "data_offset": 0, 00:24:59.575 "data_size": 65536 00:24:59.575 }, 00:24:59.575 { 00:24:59.575 "name": "BaseBdev2", 00:24:59.575 "uuid": "06c5035e-2aa0-592f-a197-cbf89cb07d0a", 00:24:59.575 "is_configured": true, 00:24:59.575 "data_offset": 0, 00:24:59.575 "data_size": 65536 00:24:59.575 }, 00:24:59.575 { 00:24:59.575 "name": "BaseBdev3", 00:24:59.575 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:24:59.575 "is_configured": true, 00:24:59.575 "data_offset": 0, 00:24:59.575 "data_size": 65536 00:24:59.575 }, 00:24:59.575 { 00:24:59.575 "name": "BaseBdev4", 00:24:59.575 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:24:59.575 "is_configured": true, 00:24:59.575 "data_offset": 0, 00:24:59.575 "data_size": 65536 00:24:59.575 } 00:24:59.575 ] 00:24:59.575 }' 00:24:59.575 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:59.575 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:59.575 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq 
-r '.process.target // "none"' 00:24:59.575 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:59.575 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@681 -- # '[' false = true ']' 00:24:59.575 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:24:59.575 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:24:59.575 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:24:59.575 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:59.834 [2024-07-25 13:33:40.499634] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:59.834 [2024-07-25 13:33:40.502071] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:00.095 [2024-07-25 13:33:40.710307] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:00.095 [2024-07-25 13:33:40.740685] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xdf3c40 00:25:00.095 [2024-07-25 13:33:40.740702] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0xc515d0 00:25:00.095 [2024-07-25 13:33:40.741095] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:00.095 [2024-07-25 13:33:40.747694] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:00.095 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:25:00.096 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # (( 
num_base_bdevs_operational-- )) 00:25:00.096 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:00.096 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:00.096 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:00.096 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:00.096 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:00.096 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.096 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.356 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:00.356 "name": "raid_bdev1", 00:25:00.356 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:25:00.356 "strip_size_kb": 0, 00:25:00.356 "state": "online", 00:25:00.356 "raid_level": "raid1", 00:25:00.356 "superblock": false, 00:25:00.356 "num_base_bdevs": 4, 00:25:00.356 "num_base_bdevs_discovered": 3, 00:25:00.356 "num_base_bdevs_operational": 3, 00:25:00.356 "process": { 00:25:00.356 "type": "rebuild", 00:25:00.356 "target": "spare", 00:25:00.356 "progress": { 00:25:00.356 "blocks": 22528, 00:25:00.356 "percent": 34 00:25:00.356 } 00:25:00.356 }, 00:25:00.356 "base_bdevs_list": [ 00:25:00.356 { 00:25:00.356 "name": "spare", 00:25:00.356 "uuid": "88b76461-d8a3-5240-a54f-a98f3e9cbd4a", 00:25:00.356 "is_configured": true, 00:25:00.356 "data_offset": 0, 00:25:00.356 "data_size": 65536 00:25:00.356 }, 00:25:00.356 { 00:25:00.356 "name": null, 00:25:00.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:00.356 "is_configured": false, 00:25:00.356 
"data_offset": 0, 00:25:00.356 "data_size": 65536 00:25:00.356 }, 00:25:00.356 { 00:25:00.356 "name": "BaseBdev3", 00:25:00.356 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:25:00.356 "is_configured": true, 00:25:00.356 "data_offset": 0, 00:25:00.356 "data_size": 65536 00:25:00.356 }, 00:25:00.356 { 00:25:00.356 "name": "BaseBdev4", 00:25:00.356 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:25:00.356 "is_configured": true, 00:25:00.356 "data_offset": 0, 00:25:00.356 "data_size": 65536 00:25:00.356 } 00:25:00.356 ] 00:25:00.356 }' 00:25:00.356 13:33:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # local timeout=878 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:00.356 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:00.356 13:33:41 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:00.356 [2024-07-25 13:33:41.081238] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:25:00.616 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:00.616 "name": "raid_bdev1", 00:25:00.616 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:25:00.616 "strip_size_kb": 0, 00:25:00.616 "state": "online", 00:25:00.616 "raid_level": "raid1", 00:25:00.616 "superblock": false, 00:25:00.616 "num_base_bdevs": 4, 00:25:00.616 "num_base_bdevs_discovered": 3, 00:25:00.616 "num_base_bdevs_operational": 3, 00:25:00.616 "process": { 00:25:00.616 "type": "rebuild", 00:25:00.616 "target": "spare", 00:25:00.616 "progress": { 00:25:00.616 "blocks": 26624, 00:25:00.616 "percent": 40 00:25:00.616 } 00:25:00.616 }, 00:25:00.616 "base_bdevs_list": [ 00:25:00.616 { 00:25:00.616 "name": "spare", 00:25:00.616 "uuid": "88b76461-d8a3-5240-a54f-a98f3e9cbd4a", 00:25:00.616 "is_configured": true, 00:25:00.616 "data_offset": 0, 00:25:00.616 "data_size": 65536 00:25:00.616 }, 00:25:00.616 { 00:25:00.616 "name": null, 00:25:00.616 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:00.616 "is_configured": false, 00:25:00.616 "data_offset": 0, 00:25:00.616 "data_size": 65536 00:25:00.616 }, 00:25:00.616 { 00:25:00.616 "name": "BaseBdev3", 00:25:00.616 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:25:00.616 "is_configured": true, 00:25:00.616 "data_offset": 0, 00:25:00.616 "data_size": 65536 00:25:00.616 }, 00:25:00.616 { 00:25:00.616 "name": "BaseBdev4", 00:25:00.616 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:25:00.616 "is_configured": true, 00:25:00.616 "data_offset": 0, 00:25:00.616 "data_size": 65536 00:25:00.616 } 00:25:00.616 ] 00:25:00.616 }' 00:25:00.616 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:25:00.616 [2024-07-25 13:33:41.290258] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:25:00.616 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:00.616 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:00.616 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:00.616 13:33:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:00.876 [2024-07-25 13:33:41.631610] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:25:00.876 [2024-07-25 13:33:41.631895] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:25:01.445 [2024-07-25 13:33:41.942342] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:25:01.445 [2024-07-25 13:33:42.058261] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:25:01.705 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:01.705 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:01.705 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:01.705 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:01.705 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:01.705 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:01.705 13:33:42 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.705 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.705 [2024-07-25 13:33:42.391023] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:25:01.965 [2024-07-25 13:33:42.506168] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:25:01.965 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.965 "name": "raid_bdev1", 00:25:01.965 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:25:01.965 "strip_size_kb": 0, 00:25:01.965 "state": "online", 00:25:01.965 "raid_level": "raid1", 00:25:01.965 "superblock": false, 00:25:01.965 "num_base_bdevs": 4, 00:25:01.965 "num_base_bdevs_discovered": 3, 00:25:01.965 "num_base_bdevs_operational": 3, 00:25:01.965 "process": { 00:25:01.965 "type": "rebuild", 00:25:01.965 "target": "spare", 00:25:01.965 "progress": { 00:25:01.965 "blocks": 47104, 00:25:01.965 "percent": 71 00:25:01.965 } 00:25:01.965 }, 00:25:01.965 "base_bdevs_list": [ 00:25:01.965 { 00:25:01.965 "name": "spare", 00:25:01.965 "uuid": "88b76461-d8a3-5240-a54f-a98f3e9cbd4a", 00:25:01.965 "is_configured": true, 00:25:01.965 "data_offset": 0, 00:25:01.965 "data_size": 65536 00:25:01.965 }, 00:25:01.965 { 00:25:01.965 "name": null, 00:25:01.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:01.965 "is_configured": false, 00:25:01.965 "data_offset": 0, 00:25:01.965 "data_size": 65536 00:25:01.965 }, 00:25:01.965 { 00:25:01.965 "name": "BaseBdev3", 00:25:01.965 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:25:01.965 "is_configured": true, 00:25:01.965 "data_offset": 0, 00:25:01.965 "data_size": 65536 00:25:01.965 }, 00:25:01.965 { 00:25:01.965 "name": 
"BaseBdev4", 00:25:01.965 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:25:01.965 "is_configured": true, 00:25:01.965 "data_offset": 0, 00:25:01.965 "data_size": 65536 00:25:01.965 } 00:25:01.965 ] 00:25:01.965 }' 00:25:01.965 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:01.965 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:01.965 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.965 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:01.965 13:33:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:02.225 [2024-07-25 13:33:42.830433] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:25:03.163 [2024-07-25 13:33:43.585951] bdev_raid.c:2886:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.163 [2024-07-25 13:33:43.686219] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:03.163 [2024-07-25 13:33:43.687758] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:03.163 "name": "raid_bdev1", 00:25:03.163 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:25:03.163 "strip_size_kb": 0, 00:25:03.163 "state": "online", 00:25:03.163 "raid_level": "raid1", 00:25:03.163 "superblock": false, 00:25:03.163 "num_base_bdevs": 4, 00:25:03.163 "num_base_bdevs_discovered": 3, 00:25:03.163 "num_base_bdevs_operational": 3, 00:25:03.163 "base_bdevs_list": [ 00:25:03.163 { 00:25:03.163 "name": "spare", 00:25:03.163 "uuid": "88b76461-d8a3-5240-a54f-a98f3e9cbd4a", 00:25:03.163 "is_configured": true, 00:25:03.163 "data_offset": 0, 00:25:03.163 "data_size": 65536 00:25:03.163 }, 00:25:03.163 { 00:25:03.163 "name": null, 00:25:03.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:03.163 "is_configured": false, 00:25:03.163 "data_offset": 0, 00:25:03.163 "data_size": 65536 00:25:03.163 }, 00:25:03.163 { 00:25:03.163 "name": "BaseBdev3", 00:25:03.163 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:25:03.163 "is_configured": true, 00:25:03.163 "data_offset": 0, 00:25:03.163 "data_size": 65536 00:25:03.163 }, 00:25:03.163 { 00:25:03.163 "name": "BaseBdev4", 00:25:03.163 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:25:03.163 "is_configured": true, 00:25:03.163 "data_offset": 0, 00:25:03.163 "data_size": 65536 00:25:03.163 } 00:25:03.163 ] 00:25:03.163 }' 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:03.163 13:33:43 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # break 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.163 13:33:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.423 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:03.423 "name": "raid_bdev1", 00:25:03.423 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:25:03.423 "strip_size_kb": 0, 00:25:03.423 "state": "online", 00:25:03.423 "raid_level": "raid1", 00:25:03.423 "superblock": false, 00:25:03.423 "num_base_bdevs": 4, 00:25:03.423 "num_base_bdevs_discovered": 3, 00:25:03.423 "num_base_bdevs_operational": 3, 00:25:03.423 "base_bdevs_list": [ 00:25:03.423 { 00:25:03.423 "name": "spare", 00:25:03.423 "uuid": "88b76461-d8a3-5240-a54f-a98f3e9cbd4a", 00:25:03.423 "is_configured": true, 00:25:03.423 "data_offset": 0, 00:25:03.423 "data_size": 65536 00:25:03.423 }, 00:25:03.423 { 00:25:03.423 "name": null, 00:25:03.423 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:03.423 
"is_configured": false, 00:25:03.423 "data_offset": 0, 00:25:03.423 "data_size": 65536 00:25:03.423 }, 00:25:03.423 { 00:25:03.423 "name": "BaseBdev3", 00:25:03.423 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:25:03.423 "is_configured": true, 00:25:03.423 "data_offset": 0, 00:25:03.423 "data_size": 65536 00:25:03.423 }, 00:25:03.423 { 00:25:03.423 "name": "BaseBdev4", 00:25:03.423 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:25:03.423 "is_configured": true, 00:25:03.423 "data_offset": 0, 00:25:03.423 "data_size": 65536 00:25:03.423 } 00:25:03.423 ] 00:25:03.423 }' 00:25:03.423 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:03.423 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:03.423 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 
-- # local num_base_bdevs_discovered 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.682 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:03.682 "name": "raid_bdev1", 00:25:03.682 "uuid": "aa83052c-f7d7-4ebd-82ea-17a4a30dd75f", 00:25:03.682 "strip_size_kb": 0, 00:25:03.682 "state": "online", 00:25:03.682 "raid_level": "raid1", 00:25:03.682 "superblock": false, 00:25:03.682 "num_base_bdevs": 4, 00:25:03.682 "num_base_bdevs_discovered": 3, 00:25:03.682 "num_base_bdevs_operational": 3, 00:25:03.682 "base_bdevs_list": [ 00:25:03.682 { 00:25:03.682 "name": "spare", 00:25:03.682 "uuid": "88b76461-d8a3-5240-a54f-a98f3e9cbd4a", 00:25:03.682 "is_configured": true, 00:25:03.682 "data_offset": 0, 00:25:03.682 "data_size": 65536 00:25:03.682 }, 00:25:03.682 { 00:25:03.682 "name": null, 00:25:03.682 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:03.682 "is_configured": false, 00:25:03.682 "data_offset": 0, 00:25:03.682 "data_size": 65536 00:25:03.682 }, 00:25:03.682 { 00:25:03.682 "name": "BaseBdev3", 00:25:03.682 "uuid": "9894832f-b213-5f9b-a2b6-fc6a139e4840", 00:25:03.682 "is_configured": true, 00:25:03.682 "data_offset": 0, 00:25:03.682 "data_size": 65536 00:25:03.682 }, 00:25:03.682 { 00:25:03.682 "name": "BaseBdev4", 00:25:03.682 "uuid": "8bdc4e4f-8933-5ed1-9ffc-19168aad6c33", 00:25:03.682 "is_configured": true, 00:25:03.682 "data_offset": 0, 00:25:03.682 "data_size": 65536 00:25:03.682 } 00:25:03.682 ] 00:25:03.683 }' 00:25:03.683 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:03.683 13:33:44 
bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:04.251 13:33:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:04.510 [2024-07-25 13:33:45.113688] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:04.510 [2024-07-25 13:33:45.113714] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:04.510 00:25:04.510 Latency(us) 00:25:04.510 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:04.510 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:04.510 raid_bdev1 : 10.17 102.17 306.50 0.00 0.00 12535.81 250.49 109697.18 00:25:04.510 =================================================================================================================== 00:25:04.510 Total : 102.17 306.50 0.00 0.00 12535.81 250.49 109697.18 00:25:04.510 [2024-07-25 13:33:45.209141] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:04.510 [2024-07-25 13:33:45.209165] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:04.510 [2024-07-25 13:33:45.209238] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:04.510 [2024-07-25 13:33:45.209244] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc4dfe0 name raid_bdev1, state offline 00:25:04.510 0 00:25:04.510 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.510 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # jq length 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:25:04.769 13:33:45 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:04.769 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:05.029 /dev/nbd0 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:05.029 
13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:05.029 1+0 records in 00:25:05.029 1+0 records out 00:25:05.029 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262479 s, 15.6 MB/s 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z '' ']' 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@743 -- # continue 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in 
"${base_bdevs[@]:1}" 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev3 ']' 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:05.029 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:25:05.289 /dev/nbd1 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:05.289 1+0 records in 00:25:05.289 1+0 records out 00:25:05.289 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000172105 s, 23.8 MB/s 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # 
nbd_list=('/dev/nbd1') 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:05.289 13:33:45 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev4 ']' 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local 
bdev_list 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:05.566 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:25:06.135 /dev/nbd1 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@869 -- # local i 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@873 -- # break 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:06.135 1+0 
records in 00:25:06.135 1+0 records out 00:25:06.135 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272164 s, 15.0 MB/s 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # size=4096 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@889 -- # return 0 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@746 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:06.135 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd1 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:06.396 13:33:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:06.396 13:33:47 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@758 -- # '[' false = true ']' 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@798 -- # killprocess 1024647 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@950 -- # '[' -z 1024647 ']' 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # kill -0 1024647 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # uname 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:06.396 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1024647 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1024647' 00:25:06.657 killing process with pid 1024647 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@969 -- # kill 1024647 00:25:06.657 Received shutdown signal, test time was about 12.165000 seconds 00:25:06.657 00:25:06.657 Latency(us) 00:25:06.657 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:06.657 =================================================================================================================== 00:25:06.657 Total : 0.00 0.00 
0.00 0.00 0.00 0.00 0.00 00:25:06.657 [2024-07-25 13:33:47.205831] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@974 -- # wait 1024647 00:25:06.657 [2024-07-25 13:33:47.228798] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@800 -- # return 0 00:25:06.657 00:25:06.657 real 0m16.718s 00:25:06.657 user 0m26.165s 00:25:06.657 sys 0m2.359s 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:25:06.657 ************************************ 00:25:06.657 END TEST raid_rebuild_test_io 00:25:06.657 ************************************ 00:25:06.657 13:33:47 bdev_raid -- bdev/bdev_raid.sh@960 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:25:06.657 13:33:47 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:25:06.657 13:33:47 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:06.657 13:33:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:06.657 ************************************ 00:25:06.657 START TEST raid_rebuild_test_sb_io 00:25:06.657 ************************************ 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 4 true true true 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=4 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@587 -- # local background_io=true 00:25:06.657 
13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # local verify=true 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev3 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # echo BaseBdev4 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # local 
strip_size 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # local create_arg 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@594 -- # local data_offset 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # raid_pid=1027651 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@613 -- # waitforlisten 1027651 /var/tmp/spdk-raid.sock 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@831 -- # '[' -z 1027651 ']' 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:06.657 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:06.657 13:33:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:06.918 [2024-07-25 13:33:47.507174] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:25:06.918 [2024-07-25 13:33:47.507223] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1027651 ] 00:25:06.918 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:06.918 Zero copy mechanism will not be used. 00:25:06.918 [2024-07-25 13:33:47.594306] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:06.918 [2024-07-25 13:33:47.659571] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:06.918 [2024-07-25 13:33:47.702491] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:06.918 [2024-07-25 13:33:47.702513] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:07.858 13:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:07.858 13:33:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@864 -- # return 0 00:25:07.858 13:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:07.858 13:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:25:07.858 BaseBdev1_malloc 00:25:07.858 13:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p 
BaseBdev1 00:25:08.117 [2024-07-25 13:33:48.680937] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:08.117 [2024-07-25 13:33:48.680971] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.117 [2024-07-25 13:33:48.680986] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x104ad10 00:25:08.117 [2024-07-25 13:33:48.680993] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.117 [2024-07-25 13:33:48.682253] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.117 [2024-07-25 13:33:48.682274] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:08.117 BaseBdev1 00:25:08.117 13:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:08.117 13:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:25:08.117 BaseBdev2_malloc 00:25:08.117 13:33:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:08.377 [2024-07-25 13:33:49.063982] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:08.377 [2024-07-25 13:33:49.064011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.377 [2024-07-25 13:33:49.064025] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x104b6d0 00:25:08.377 [2024-07-25 13:33:49.064031] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.377 [2024-07-25 13:33:49.065209] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.377 [2024-07-25 
13:33:49.065228] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:08.377 BaseBdev2 00:25:08.377 13:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:08.377 13:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:25:08.637 BaseBdev3_malloc 00:25:08.637 13:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:25:08.897 [2024-07-25 13:33:49.450954] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:25:08.897 [2024-07-25 13:33:49.450983] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:08.897 [2024-07-25 13:33:49.450994] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x110ea30 00:25:08.897 [2024-07-25 13:33:49.451001] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:08.897 [2024-07-25 13:33:49.452185] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:08.897 [2024-07-25 13:33:49.452204] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:25:08.897 BaseBdev3 00:25:08.897 13:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:08.897 13:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:25:08.897 BaseBdev4_malloc 00:25:08.897 13:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:25:09.156 [2024-07-25 13:33:49.837809] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:25:09.156 [2024-07-25 13:33:49.837837] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:09.156 [2024-07-25 13:33:49.837849] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1043d60 00:25:09.156 [2024-07-25 13:33:49.837860] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:09.156 [2024-07-25 13:33:49.839043] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:09.156 [2024-07-25 13:33:49.839062] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:25:09.156 BaseBdev4 00:25:09.156 13:33:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:25:09.416 spare_malloc 00:25:09.416 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:09.676 spare_delay 00:25:09.676 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:09.676 [2024-07-25 13:33:50.409289] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:09.676 [2024-07-25 13:33:50.409320] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:09.676 [2024-07-25 13:33:50.409334] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x1045820 00:25:09.676 [2024-07-25 13:33:50.409340] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:09.676 [2024-07-25 13:33:50.410557] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:09.676 [2024-07-25 13:33:50.410575] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:09.676 spare 00:25:09.676 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:25:09.962 [2024-07-25 13:33:50.597791] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:09.962 [2024-07-25 13:33:50.598798] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:09.962 [2024-07-25 13:33:50.598841] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:09.962 [2024-07-25 13:33:50.598875] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:09.962 [2024-07-25 13:33:50.599006] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1046fe0 00:25:09.962 [2024-07-25 13:33:50.599014] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:09.962 [2024-07-25 13:33:50.599171] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1045490 00:25:09.962 [2024-07-25 13:33:50.599288] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1046fe0 00:25:09.962 [2024-07-25 13:33:50.599294] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1046fe0 00:25:09.962 [2024-07-25 13:33:50.599377] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@628 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:09.962 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:10.223 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:10.223 "name": "raid_bdev1", 00:25:10.223 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:10.223 "strip_size_kb": 0, 00:25:10.223 "state": "online", 00:25:10.223 "raid_level": "raid1", 00:25:10.223 "superblock": true, 00:25:10.223 "num_base_bdevs": 4, 00:25:10.223 "num_base_bdevs_discovered": 4, 00:25:10.223 "num_base_bdevs_operational": 4, 00:25:10.223 "base_bdevs_list": [ 00:25:10.223 { 00:25:10.224 "name": "BaseBdev1", 00:25:10.224 "uuid": "8bed1e8c-7285-58f2-8df0-5cb652995f7a", 00:25:10.224 
"is_configured": true, 00:25:10.224 "data_offset": 2048, 00:25:10.224 "data_size": 63488 00:25:10.224 }, 00:25:10.224 { 00:25:10.224 "name": "BaseBdev2", 00:25:10.224 "uuid": "68a14383-9ac9-500b-a1d8-3e734018a539", 00:25:10.224 "is_configured": true, 00:25:10.224 "data_offset": 2048, 00:25:10.224 "data_size": 63488 00:25:10.224 }, 00:25:10.224 { 00:25:10.224 "name": "BaseBdev3", 00:25:10.224 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:10.224 "is_configured": true, 00:25:10.224 "data_offset": 2048, 00:25:10.224 "data_size": 63488 00:25:10.224 }, 00:25:10.224 { 00:25:10.224 "name": "BaseBdev4", 00:25:10.224 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:10.224 "is_configured": true, 00:25:10.224 "data_offset": 2048, 00:25:10.224 "data_size": 63488 00:25:10.224 } 00:25:10.224 ] 00:25:10.224 }' 00:25:10.224 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:10.224 13:33:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:10.793 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:10.793 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:25:10.793 [2024-07-25 13:33:51.556435] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:10.793 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=63488 00:25:10.793 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.793 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:11.053 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@634 -- # data_offset=2048 
00:25:11.053 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@636 -- # '[' true = true ']' 00:25:11.053 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:11.053 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@638 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:25:11.314 [2024-07-25 13:33:51.862414] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11ecc40 00:25:11.314 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:11.314 Zero copy mechanism will not be used. 00:25:11.314 Running I/O for 60 seconds... 00:25:11.314 [2024-07-25 13:33:51.944778] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:11.314 [2024-07-25 13:33:51.951244] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x11ecc40 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:11.314 13:33:51 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:11.314 13:33:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:11.614 13:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:11.614 "name": "raid_bdev1", 00:25:11.614 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:11.614 "strip_size_kb": 0, 00:25:11.614 "state": "online", 00:25:11.614 "raid_level": "raid1", 00:25:11.614 "superblock": true, 00:25:11.614 "num_base_bdevs": 4, 00:25:11.614 "num_base_bdevs_discovered": 3, 00:25:11.614 "num_base_bdevs_operational": 3, 00:25:11.614 "base_bdevs_list": [ 00:25:11.614 { 00:25:11.614 "name": null, 00:25:11.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:11.614 "is_configured": false, 00:25:11.614 "data_offset": 2048, 00:25:11.614 "data_size": 63488 00:25:11.614 }, 00:25:11.614 { 00:25:11.614 "name": "BaseBdev2", 00:25:11.614 "uuid": "68a14383-9ac9-500b-a1d8-3e734018a539", 00:25:11.614 "is_configured": true, 00:25:11.614 "data_offset": 2048, 00:25:11.614 "data_size": 63488 00:25:11.614 }, 00:25:11.614 { 00:25:11.614 "name": "BaseBdev3", 00:25:11.614 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:11.614 "is_configured": true, 00:25:11.614 "data_offset": 2048, 00:25:11.614 "data_size": 63488 00:25:11.614 }, 00:25:11.614 { 00:25:11.614 "name": "BaseBdev4", 00:25:11.614 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:11.614 "is_configured": true, 00:25:11.614 "data_offset": 2048, 00:25:11.614 "data_size": 63488 00:25:11.614 } 00:25:11.614 ] 00:25:11.614 }' 00:25:11.614 13:33:52 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:11.614 13:33:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:12.185 13:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:12.185 [2024-07-25 13:33:52.908304] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:12.185 13:33:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:12.185 [2024-07-25 13:33:52.964692] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11ecee0 00:25:12.185 [2024-07-25 13:33:52.966358] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:12.445 [2024-07-25 13:33:53.075517] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:12.445 [2024-07-25 13:33:53.076271] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:12.709 [2024-07-25 13:33:53.298016] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:12.709 [2024-07-25 13:33:53.304538] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:13.279 [2024-07-25 13:33:53.786852] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:13.279 13:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:13.279 13:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:13.279 13:33:53 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:13.279 13:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:13.279 13:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:13.279 13:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.279 13:33:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:13.279 [2024-07-25 13:33:54.004269] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:13.279 [2024-07-25 13:33:54.005044] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:13.539 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:13.539 "name": "raid_bdev1", 00:25:13.539 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:13.539 "strip_size_kb": 0, 00:25:13.539 "state": "online", 00:25:13.539 "raid_level": "raid1", 00:25:13.539 "superblock": true, 00:25:13.539 "num_base_bdevs": 4, 00:25:13.539 "num_base_bdevs_discovered": 4, 00:25:13.539 "num_base_bdevs_operational": 4, 00:25:13.539 "process": { 00:25:13.539 "type": "rebuild", 00:25:13.539 "target": "spare", 00:25:13.539 "progress": { 00:25:13.539 "blocks": 14336, 00:25:13.539 "percent": 22 00:25:13.539 } 00:25:13.539 }, 00:25:13.539 "base_bdevs_list": [ 00:25:13.539 { 00:25:13.539 "name": "spare", 00:25:13.539 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:13.539 "is_configured": true, 00:25:13.539 "data_offset": 2048, 00:25:13.539 "data_size": 63488 00:25:13.539 }, 00:25:13.539 { 00:25:13.539 "name": "BaseBdev2", 00:25:13.539 "uuid": "68a14383-9ac9-500b-a1d8-3e734018a539", 00:25:13.539 "is_configured": true, 
00:25:13.539 "data_offset": 2048, 00:25:13.539 "data_size": 63488 00:25:13.539 }, 00:25:13.539 { 00:25:13.539 "name": "BaseBdev3", 00:25:13.539 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:13.539 "is_configured": true, 00:25:13.539 "data_offset": 2048, 00:25:13.539 "data_size": 63488 00:25:13.539 }, 00:25:13.539 { 00:25:13.539 "name": "BaseBdev4", 00:25:13.539 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:13.539 "is_configured": true, 00:25:13.539 "data_offset": 2048, 00:25:13.539 "data_size": 63488 00:25:13.539 } 00:25:13.539 ] 00:25:13.539 }' 00:25:13.539 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:13.539 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:13.540 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:13.540 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:13.540 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:13.799 [2024-07-25 13:33:54.416199] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:13.799 [2024-07-25 13:33:54.576852] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:14.059 [2024-07-25 13:33:54.593789] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:14.059 [2024-07-25 13:33:54.593810] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:14.059 [2024-07-25 13:33:54.593816] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:14.059 [2024-07-25 13:33:54.610488] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 
0x11ecc40 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.059 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:14.320 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:14.320 "name": "raid_bdev1", 00:25:14.320 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:14.320 "strip_size_kb": 0, 00:25:14.320 "state": "online", 00:25:14.320 "raid_level": "raid1", 00:25:14.320 "superblock": true, 00:25:14.320 "num_base_bdevs": 4, 00:25:14.320 "num_base_bdevs_discovered": 3, 00:25:14.320 "num_base_bdevs_operational": 3, 00:25:14.320 "base_bdevs_list": [ 00:25:14.320 { 00:25:14.320 
"name": null, 00:25:14.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.320 "is_configured": false, 00:25:14.320 "data_offset": 2048, 00:25:14.320 "data_size": 63488 00:25:14.320 }, 00:25:14.320 { 00:25:14.320 "name": "BaseBdev2", 00:25:14.320 "uuid": "68a14383-9ac9-500b-a1d8-3e734018a539", 00:25:14.320 "is_configured": true, 00:25:14.320 "data_offset": 2048, 00:25:14.320 "data_size": 63488 00:25:14.320 }, 00:25:14.320 { 00:25:14.320 "name": "BaseBdev3", 00:25:14.320 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:14.320 "is_configured": true, 00:25:14.320 "data_offset": 2048, 00:25:14.320 "data_size": 63488 00:25:14.320 }, 00:25:14.320 { 00:25:14.320 "name": "BaseBdev4", 00:25:14.320 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:14.320 "is_configured": true, 00:25:14.320 "data_offset": 2048, 00:25:14.320 "data_size": 63488 00:25:14.320 } 00:25:14.320 ] 00:25:14.320 }' 00:25:14.320 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:14.320 13:33:54 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:14.890 "name": "raid_bdev1", 00:25:14.890 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:14.890 "strip_size_kb": 0, 00:25:14.890 "state": "online", 00:25:14.890 "raid_level": "raid1", 00:25:14.890 "superblock": true, 00:25:14.890 "num_base_bdevs": 4, 00:25:14.890 "num_base_bdevs_discovered": 3, 00:25:14.890 "num_base_bdevs_operational": 3, 00:25:14.890 "base_bdevs_list": [ 00:25:14.890 { 00:25:14.890 "name": null, 00:25:14.890 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:14.890 "is_configured": false, 00:25:14.890 "data_offset": 2048, 00:25:14.890 "data_size": 63488 00:25:14.890 }, 00:25:14.890 { 00:25:14.890 "name": "BaseBdev2", 00:25:14.890 "uuid": "68a14383-9ac9-500b-a1d8-3e734018a539", 00:25:14.890 "is_configured": true, 00:25:14.890 "data_offset": 2048, 00:25:14.890 "data_size": 63488 00:25:14.890 }, 00:25:14.890 { 00:25:14.890 "name": "BaseBdev3", 00:25:14.890 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:14.890 "is_configured": true, 00:25:14.890 "data_offset": 2048, 00:25:14.890 "data_size": 63488 00:25:14.890 }, 00:25:14.890 { 00:25:14.890 "name": "BaseBdev4", 00:25:14.890 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:14.890 "is_configured": true, 00:25:14.890 "data_offset": 2048, 00:25:14.890 "data_size": 63488 00:25:14.890 } 00:25:14.890 ] 00:25:14.890 }' 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:14.890 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:15.150 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:15.150 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@677 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:15.150 [2024-07-25 13:33:55.885910] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:15.150 13:33:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@678 -- # sleep 1 00:25:15.150 [2024-07-25 13:33:55.935526] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f8260 00:25:15.150 [2024-07-25 13:33:55.936708] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:15.410 [2024-07-25 13:33:56.059875] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:15.410 [2024-07-25 13:33:56.060137] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:25:15.673 [2024-07-25 13:33:56.269946] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:15.675 [2024-07-25 13:33:56.270313] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:25:15.939 [2024-07-25 13:33:56.710659] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:25:16.198 13:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:16.198 13:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:16.198 13:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:16.198 13:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:16.198 13:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:16.198 
13:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:16.198 13:33:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:16.198 [2024-07-25 13:33:56.951669] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:25:16.459 [2024-07-25 13:33:57.076108] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:25:16.459 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:16.459 "name": "raid_bdev1", 00:25:16.459 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:16.459 "strip_size_kb": 0, 00:25:16.459 "state": "online", 00:25:16.459 "raid_level": "raid1", 00:25:16.459 "superblock": true, 00:25:16.459 "num_base_bdevs": 4, 00:25:16.459 "num_base_bdevs_discovered": 4, 00:25:16.459 "num_base_bdevs_operational": 4, 00:25:16.459 "process": { 00:25:16.459 "type": "rebuild", 00:25:16.459 "target": "spare", 00:25:16.459 "progress": { 00:25:16.459 "blocks": 16384, 00:25:16.459 "percent": 25 00:25:16.459 } 00:25:16.459 }, 00:25:16.459 "base_bdevs_list": [ 00:25:16.459 { 00:25:16.459 "name": "spare", 00:25:16.459 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:16.459 "is_configured": true, 00:25:16.459 "data_offset": 2048, 00:25:16.459 "data_size": 63488 00:25:16.459 }, 00:25:16.459 { 00:25:16.459 "name": "BaseBdev2", 00:25:16.459 "uuid": "68a14383-9ac9-500b-a1d8-3e734018a539", 00:25:16.459 "is_configured": true, 00:25:16.459 "data_offset": 2048, 00:25:16.459 "data_size": 63488 00:25:16.459 }, 00:25:16.459 { 00:25:16.459 "name": "BaseBdev3", 00:25:16.459 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:16.459 "is_configured": true, 00:25:16.459 "data_offset": 2048, 00:25:16.459 
"data_size": 63488 00:25:16.459 }, 00:25:16.459 { 00:25:16.459 "name": "BaseBdev4", 00:25:16.459 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:16.459 "is_configured": true, 00:25:16.459 "data_offset": 2048, 00:25:16.459 "data_size": 63488 00:25:16.459 } 00:25:16.459 ] 00:25:16.459 }' 00:25:16.459 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:16.459 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:16.459 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:16.460 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:16.460 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:25:16.460 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:25:16.460 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:25:16.460 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=4 00:25:16.460 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:25:16.460 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # '[' 4 -gt 2 ']' 00:25:16.460 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:25:16.719 [2024-07-25 13:33:57.345320] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:16.719 [2024-07-25 13:33:57.345553] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:25:16.719 [2024-07-25 
13:33:57.401520] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:16.979 [2024-07-25 13:33:57.578456] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:25:17.239 [2024-07-25 13:33:57.778983] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x11ecc40 00:25:17.239 [2024-07-25 13:33:57.779002] bdev_raid.c:1961:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x11f8260 00:25:17.239 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@713 -- # base_bdevs[1]= 00:25:17.239 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # (( num_base_bdevs_operational-- )) 00:25:17.239 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@717 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:17.239 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:17.239 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:17.239 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:17.239 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:17.239 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.239 13:33:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.239 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:17.239 "name": "raid_bdev1", 00:25:17.239 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:17.239 "strip_size_kb": 0, 00:25:17.239 "state": "online", 00:25:17.239 "raid_level": "raid1", 00:25:17.239 "superblock": true, 00:25:17.239 
"num_base_bdevs": 4, 00:25:17.239 "num_base_bdevs_discovered": 3, 00:25:17.239 "num_base_bdevs_operational": 3, 00:25:17.239 "process": { 00:25:17.239 "type": "rebuild", 00:25:17.239 "target": "spare", 00:25:17.239 "progress": { 00:25:17.239 "blocks": 24576, 00:25:17.239 "percent": 38 00:25:17.239 } 00:25:17.239 }, 00:25:17.239 "base_bdevs_list": [ 00:25:17.239 { 00:25:17.239 "name": "spare", 00:25:17.239 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:17.239 "is_configured": true, 00:25:17.239 "data_offset": 2048, 00:25:17.239 "data_size": 63488 00:25:17.239 }, 00:25:17.239 { 00:25:17.239 "name": null, 00:25:17.239 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:17.239 "is_configured": false, 00:25:17.239 "data_offset": 2048, 00:25:17.239 "data_size": 63488 00:25:17.239 }, 00:25:17.239 { 00:25:17.239 "name": "BaseBdev3", 00:25:17.239 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:17.239 "is_configured": true, 00:25:17.239 "data_offset": 2048, 00:25:17.239 "data_size": 63488 00:25:17.239 }, 00:25:17.239 { 00:25:17.239 "name": "BaseBdev4", 00:25:17.239 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:17.239 "is_configured": true, 00:25:17.239 "data_offset": 2048, 00:25:17.239 "data_size": 63488 00:25:17.239 } 00:25:17.239 ] 00:25:17.239 }' 00:25:17.239 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:17.239 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # local timeout=895 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:17.499 13:33:58 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:17.499 "name": "raid_bdev1", 00:25:17.499 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:17.499 "strip_size_kb": 0, 00:25:17.499 "state": "online", 00:25:17.499 "raid_level": "raid1", 00:25:17.499 "superblock": true, 00:25:17.499 "num_base_bdevs": 4, 00:25:17.499 "num_base_bdevs_discovered": 3, 00:25:17.499 "num_base_bdevs_operational": 3, 00:25:17.499 "process": { 00:25:17.499 "type": "rebuild", 00:25:17.499 "target": "spare", 00:25:17.499 "progress": { 00:25:17.499 "blocks": 28672, 00:25:17.499 "percent": 45 00:25:17.499 } 00:25:17.499 }, 00:25:17.499 "base_bdevs_list": [ 00:25:17.499 { 00:25:17.499 "name": "spare", 00:25:17.499 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:17.499 "is_configured": true, 00:25:17.499 "data_offset": 2048, 00:25:17.499 "data_size": 63488 00:25:17.499 }, 00:25:17.499 { 00:25:17.499 "name": null, 00:25:17.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:17.499 "is_configured": false, 00:25:17.499 "data_offset": 2048, 00:25:17.499 
"data_size": 63488 00:25:17.499 }, 00:25:17.499 { 00:25:17.499 "name": "BaseBdev3", 00:25:17.499 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:17.499 "is_configured": true, 00:25:17.499 "data_offset": 2048, 00:25:17.499 "data_size": 63488 00:25:17.499 }, 00:25:17.499 { 00:25:17.499 "name": "BaseBdev4", 00:25:17.499 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:17.499 "is_configured": true, 00:25:17.499 "data_offset": 2048, 00:25:17.499 "data_size": 63488 00:25:17.499 } 00:25:17.499 ] 00:25:17.499 }' 00:25:17.499 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:17.759 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:17.759 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:17.759 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:17.759 13:33:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:17.759 [2024-07-25 13:33:58.482666] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:25:18.698 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:18.698 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:18.698 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:18.698 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:18.698 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:18.698 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:18.698 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.698 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:18.698 [2024-07-25 13:33:59.451188] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 51200 offset_begin: 49152 offset_end: 55296 00:25:18.958 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:18.958 "name": "raid_bdev1", 00:25:18.958 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:18.958 "strip_size_kb": 0, 00:25:18.958 "state": "online", 00:25:18.958 "raid_level": "raid1", 00:25:18.958 "superblock": true, 00:25:18.958 "num_base_bdevs": 4, 00:25:18.958 "num_base_bdevs_discovered": 3, 00:25:18.958 "num_base_bdevs_operational": 3, 00:25:18.958 "process": { 00:25:18.958 "type": "rebuild", 00:25:18.958 "target": "spare", 00:25:18.958 "progress": { 00:25:18.958 "blocks": 51200, 00:25:18.958 "percent": 80 00:25:18.958 } 00:25:18.958 }, 00:25:18.958 "base_bdevs_list": [ 00:25:18.958 { 00:25:18.958 "name": "spare", 00:25:18.958 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:18.958 "is_configured": true, 00:25:18.958 "data_offset": 2048, 00:25:18.958 "data_size": 63488 00:25:18.958 }, 00:25:18.958 { 00:25:18.958 "name": null, 00:25:18.958 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:18.958 "is_configured": false, 00:25:18.958 "data_offset": 2048, 00:25:18.958 "data_size": 63488 00:25:18.958 }, 00:25:18.958 { 00:25:18.958 "name": "BaseBdev3", 00:25:18.958 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:18.958 "is_configured": true, 00:25:18.958 "data_offset": 2048, 00:25:18.958 "data_size": 63488 00:25:18.958 }, 00:25:18.958 { 00:25:18.958 "name": "BaseBdev4", 00:25:18.958 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:18.958 "is_configured": true, 00:25:18.958 "data_offset": 2048, 00:25:18.958 
"data_size": 63488 00:25:18.958 } 00:25:18.958 ] 00:25:18.958 }' 00:25:18.958 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:18.958 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:18.958 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:18.958 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:18.958 13:33:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # sleep 1 00:25:19.218 [2024-07-25 13:33:59.775790] bdev_raid.c: 852:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:25:19.478 [2024-07-25 13:34:00.106390] bdev_raid.c:2886:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:19.478 [2024-07-25 13:34:00.206653] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:19.478 [2024-07-25 13:34:00.208297] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:20.047 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:25:20.047 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:20.047 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:20.047 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:20.047 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:20.047 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:20.047 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.047 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:20.307 "name": "raid_bdev1", 00:25:20.307 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:20.307 "strip_size_kb": 0, 00:25:20.307 "state": "online", 00:25:20.307 "raid_level": "raid1", 00:25:20.307 "superblock": true, 00:25:20.307 "num_base_bdevs": 4, 00:25:20.307 "num_base_bdevs_discovered": 3, 00:25:20.307 "num_base_bdevs_operational": 3, 00:25:20.307 "base_bdevs_list": [ 00:25:20.307 { 00:25:20.307 "name": "spare", 00:25:20.307 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:20.307 "is_configured": true, 00:25:20.307 "data_offset": 2048, 00:25:20.307 "data_size": 63488 00:25:20.307 }, 00:25:20.307 { 00:25:20.307 "name": null, 00:25:20.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:20.307 "is_configured": false, 00:25:20.307 "data_offset": 2048, 00:25:20.307 "data_size": 63488 00:25:20.307 }, 00:25:20.307 { 00:25:20.307 "name": "BaseBdev3", 00:25:20.307 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:20.307 "is_configured": true, 00:25:20.307 "data_offset": 2048, 00:25:20.307 "data_size": 63488 00:25:20.307 }, 00:25:20.307 { 00:25:20.307 "name": "BaseBdev4", 00:25:20.307 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:20.307 "is_configured": true, 00:25:20.307 "data_offset": 2048, 00:25:20.307 "data_size": 63488 00:25:20.307 } 00:25:20.307 ] 00:25:20.307 }' 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r 
'.process.target // "none"' 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # break 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.307 13:34:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:20.567 "name": "raid_bdev1", 00:25:20.567 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:20.567 "strip_size_kb": 0, 00:25:20.567 "state": "online", 00:25:20.567 "raid_level": "raid1", 00:25:20.567 "superblock": true, 00:25:20.567 "num_base_bdevs": 4, 00:25:20.567 "num_base_bdevs_discovered": 3, 00:25:20.567 "num_base_bdevs_operational": 3, 00:25:20.567 "base_bdevs_list": [ 00:25:20.567 { 00:25:20.567 "name": "spare", 00:25:20.567 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:20.567 "is_configured": true, 00:25:20.567 "data_offset": 2048, 00:25:20.567 "data_size": 63488 00:25:20.567 }, 00:25:20.567 { 00:25:20.567 "name": null, 00:25:20.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:20.567 "is_configured": false, 
00:25:20.567 "data_offset": 2048, 00:25:20.567 "data_size": 63488 00:25:20.567 }, 00:25:20.567 { 00:25:20.567 "name": "BaseBdev3", 00:25:20.567 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:20.567 "is_configured": true, 00:25:20.567 "data_offset": 2048, 00:25:20.567 "data_size": 63488 00:25:20.567 }, 00:25:20.567 { 00:25:20.567 "name": "BaseBdev4", 00:25:20.567 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:20.567 "is_configured": true, 00:25:20.567 "data_offset": 2048, 00:25:20.567 "data_size": 63488 00:25:20.567 } 00:25:20.567 ] 00:25:20.567 }' 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:20.567 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:20.827 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:20.827 "name": "raid_bdev1", 00:25:20.827 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:20.827 "strip_size_kb": 0, 00:25:20.827 "state": "online", 00:25:20.827 "raid_level": "raid1", 00:25:20.827 "superblock": true, 00:25:20.827 "num_base_bdevs": 4, 00:25:20.827 "num_base_bdevs_discovered": 3, 00:25:20.827 "num_base_bdevs_operational": 3, 00:25:20.827 "base_bdevs_list": [ 00:25:20.827 { 00:25:20.827 "name": "spare", 00:25:20.827 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:20.827 "is_configured": true, 00:25:20.827 "data_offset": 2048, 00:25:20.827 "data_size": 63488 00:25:20.827 }, 00:25:20.827 { 00:25:20.827 "name": null, 00:25:20.827 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:20.827 "is_configured": false, 00:25:20.827 "data_offset": 2048, 00:25:20.827 "data_size": 63488 00:25:20.827 }, 00:25:20.827 { 00:25:20.827 "name": "BaseBdev3", 00:25:20.827 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:20.827 "is_configured": true, 00:25:20.827 "data_offset": 2048, 00:25:20.827 "data_size": 63488 00:25:20.827 }, 00:25:20.827 { 00:25:20.827 "name": "BaseBdev4", 00:25:20.827 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:20.827 "is_configured": true, 00:25:20.827 "data_offset": 2048, 00:25:20.827 "data_size": 63488 00:25:20.827 } 00:25:20.827 ] 00:25:20.827 }' 00:25:20.827 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:25:20.827 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:21.440 13:34:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:21.440 [2024-07-25 13:34:02.151248] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:21.440 [2024-07-25 13:34:02.151269] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:21.700 00:25:21.700 Latency(us) 00:25:21.700 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:21.700 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:25:21.700 raid_bdev1 : 10.36 102.96 308.89 0.00 0.00 13113.42 247.34 116956.55 00:25:21.700 =================================================================================================================== 00:25:21.700 Total : 102.96 308.89 0.00 0.00 13113.42 247.34 116956.55 00:25:21.700 [2024-07-25 13:34:02.254802] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:21.700 [2024-07-25 13:34:02.254826] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:21.700 [2024-07-25 13:34:02.254900] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:21.700 [2024-07-25 13:34:02.254907] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1046fe0 name raid_bdev1, state offline 00:25:21.700 0 00:25:21.700 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.700 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@735 -- # jq length 00:25:21.700 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:25:21.700 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@738 -- # '[' true = true ']' 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@740 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:21.701 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:25:21.961 /dev/nbd0 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:21.961 
13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:21.961 1+0 records in 00:25:21.961 1+0 records out 00:25:21.961 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000315238 s, 13.0 MB/s 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:21.961 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z '' ']' 00:25:21.962 13:34:02 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@743 -- # continue 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev3 ']' 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:21.962 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:25:22.222 /dev/nbd1 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:22.222 1+0 records in 00:25:22.222 1+0 records out 00:25:22.222 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000316854 s, 12.9 MB/s 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:22.222 13:34:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:22.222 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # 
nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:22.222 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:22.222 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:22.222 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:22.222 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:22.222 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:22.222 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@741 -- # for bdev in "${base_bdevs[@]:1}" 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' -z BaseBdev4 ']' 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:25:22.482 
13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:22.482 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:25:22.743 /dev/nbd1 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@869 -- # local i 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@873 -- # break 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@884 -- # (( i = 1 )) 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:22.743 1+0 records in 00:25:22.743 1+0 records out 00:25:22.743 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000276156 s, 14.8 MB/s 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # size=4096 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@889 -- # return 0 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@746 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@51 -- # local i 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:22.743 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:23.004 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:23.264 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:23.264 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:23.264 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:23.264 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:23.264 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:23.264 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:23.264 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:25:23.264 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:25:23.264 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:25:23.264 13:34:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:23.524 [2024-07-25 13:34:04.236915] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:23.524 [2024-07-25 13:34:04.236947] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:23.524 [2024-07-25 13:34:04.236961] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11f8580 00:25:23.524 [2024-07-25 13:34:04.236968] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:23.524 [2024-07-25 13:34:04.238272] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:25:23.524 [2024-07-25 13:34:04.238294] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:23.524 [2024-07-25 13:34:04.238354] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:23.524 [2024-07-25 13:34:04.238373] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:23.524 [2024-07-25 13:34:04.238458] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:25:23.524 [2024-07-25 13:34:04.238513] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:25:23.524 spare 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:23.524 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:23.524 13:34:04 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:23.785 [2024-07-25 13:34:04.338811] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1047540 00:25:23.785 [2024-07-25 13:34:04.338821] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:25:23.785 [2024-07-25 13:34:04.338984] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x11f88e0 00:25:23.785 [2024-07-25 13:34:04.339104] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1047540 00:25:23.785 [2024-07-25 13:34:04.339109] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1047540 00:25:23.785 [2024-07-25 13:34:04.339193] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:23.785 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:23.785 "name": "raid_bdev1", 00:25:23.785 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:23.785 "strip_size_kb": 0, 00:25:23.785 "state": "online", 00:25:23.785 "raid_level": "raid1", 00:25:23.785 "superblock": true, 00:25:23.785 "num_base_bdevs": 4, 00:25:23.785 "num_base_bdevs_discovered": 3, 00:25:23.785 "num_base_bdevs_operational": 3, 00:25:23.785 "base_bdevs_list": [ 00:25:23.785 { 00:25:23.785 "name": "spare", 00:25:23.785 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:23.785 "is_configured": true, 00:25:23.785 "data_offset": 2048, 00:25:23.785 "data_size": 63488 00:25:23.785 }, 00:25:23.785 { 00:25:23.785 "name": null, 00:25:23.785 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:23.785 "is_configured": false, 00:25:23.785 "data_offset": 2048, 00:25:23.785 "data_size": 63488 00:25:23.785 }, 00:25:23.785 { 00:25:23.785 "name": "BaseBdev3", 00:25:23.785 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:23.785 "is_configured": true, 00:25:23.785 "data_offset": 2048, 00:25:23.785 
"data_size": 63488 00:25:23.785 }, 00:25:23.785 { 00:25:23.785 "name": "BaseBdev4", 00:25:23.785 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:23.785 "is_configured": true, 00:25:23.785 "data_offset": 2048, 00:25:23.785 "data_size": 63488 00:25:23.785 } 00:25:23.785 ] 00:25:23.785 }' 00:25:23.785 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:23.785 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:24.354 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:24.354 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:24.354 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:24.354 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:24.354 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:24.354 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.354 13:34:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:24.614 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:24.614 "name": "raid_bdev1", 00:25:24.614 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:24.614 "strip_size_kb": 0, 00:25:24.614 "state": "online", 00:25:24.614 "raid_level": "raid1", 00:25:24.614 "superblock": true, 00:25:24.614 "num_base_bdevs": 4, 00:25:24.614 "num_base_bdevs_discovered": 3, 00:25:24.614 "num_base_bdevs_operational": 3, 00:25:24.614 "base_bdevs_list": [ 00:25:24.614 { 00:25:24.614 "name": "spare", 00:25:24.614 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 
00:25:24.614 "is_configured": true, 00:25:24.614 "data_offset": 2048, 00:25:24.614 "data_size": 63488 00:25:24.614 }, 00:25:24.614 { 00:25:24.614 "name": null, 00:25:24.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:24.614 "is_configured": false, 00:25:24.614 "data_offset": 2048, 00:25:24.614 "data_size": 63488 00:25:24.614 }, 00:25:24.614 { 00:25:24.614 "name": "BaseBdev3", 00:25:24.614 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:24.614 "is_configured": true, 00:25:24.614 "data_offset": 2048, 00:25:24.614 "data_size": 63488 00:25:24.614 }, 00:25:24.614 { 00:25:24.614 "name": "BaseBdev4", 00:25:24.614 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:24.614 "is_configured": true, 00:25:24.614 "data_offset": 2048, 00:25:24.614 "data_size": 63488 00:25:24.614 } 00:25:24.614 ] 00:25:24.614 }' 00:25:24.614 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:24.614 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:24.614 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:24.614 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:24.614 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.614 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:24.874 [2024-07-25 13:34:05.624833] 
bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.874 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:25.134 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:25.134 "name": "raid_bdev1", 00:25:25.134 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:25.134 "strip_size_kb": 0, 00:25:25.134 "state": "online", 00:25:25.134 "raid_level": "raid1", 00:25:25.134 "superblock": true, 00:25:25.134 "num_base_bdevs": 4, 00:25:25.134 "num_base_bdevs_discovered": 2, 00:25:25.134 "num_base_bdevs_operational": 2, 00:25:25.134 
"base_bdevs_list": [ 00:25:25.134 { 00:25:25.134 "name": null, 00:25:25.134 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:25.134 "is_configured": false, 00:25:25.134 "data_offset": 2048, 00:25:25.134 "data_size": 63488 00:25:25.134 }, 00:25:25.134 { 00:25:25.134 "name": null, 00:25:25.134 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:25.134 "is_configured": false, 00:25:25.134 "data_offset": 2048, 00:25:25.134 "data_size": 63488 00:25:25.134 }, 00:25:25.134 { 00:25:25.134 "name": "BaseBdev3", 00:25:25.134 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:25.134 "is_configured": true, 00:25:25.134 "data_offset": 2048, 00:25:25.134 "data_size": 63488 00:25:25.134 }, 00:25:25.134 { 00:25:25.134 "name": "BaseBdev4", 00:25:25.134 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:25.134 "is_configured": true, 00:25:25.134 "data_offset": 2048, 00:25:25.134 "data_size": 63488 00:25:25.134 } 00:25:25.134 ] 00:25:25.134 }' 00:25:25.134 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:25.134 13:34:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:25.704 13:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:25.964 [2024-07-25 13:34:06.539264] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:25.964 [2024-07-25 13:34:06.539379] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:25.964 [2024-07-25 13:34:06.539389] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:25.964 [2024-07-25 13:34:06.539414] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:25.964 [2024-07-25 13:34:06.542441] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1047cd0 00:25:25.964 [2024-07-25 13:34:06.544032] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:25.964 13:34:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # sleep 1 00:25:26.902 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:26.902 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:26.902 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:26.902 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:26.902 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:26.902 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.902 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.163 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:27.163 "name": "raid_bdev1", 00:25:27.163 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:27.163 "strip_size_kb": 0, 00:25:27.163 "state": "online", 00:25:27.163 "raid_level": "raid1", 00:25:27.163 "superblock": true, 00:25:27.163 "num_base_bdevs": 4, 00:25:27.163 "num_base_bdevs_discovered": 3, 00:25:27.163 "num_base_bdevs_operational": 3, 00:25:27.163 "process": { 00:25:27.163 "type": "rebuild", 00:25:27.163 "target": "spare", 00:25:27.163 "progress": { 00:25:27.163 "blocks": 22528, 
00:25:27.163 "percent": 35 00:25:27.163 } 00:25:27.163 }, 00:25:27.163 "base_bdevs_list": [ 00:25:27.163 { 00:25:27.163 "name": "spare", 00:25:27.163 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:27.163 "is_configured": true, 00:25:27.163 "data_offset": 2048, 00:25:27.163 "data_size": 63488 00:25:27.163 }, 00:25:27.163 { 00:25:27.163 "name": null, 00:25:27.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.163 "is_configured": false, 00:25:27.163 "data_offset": 2048, 00:25:27.163 "data_size": 63488 00:25:27.163 }, 00:25:27.163 { 00:25:27.163 "name": "BaseBdev3", 00:25:27.163 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:27.163 "is_configured": true, 00:25:27.163 "data_offset": 2048, 00:25:27.163 "data_size": 63488 00:25:27.163 }, 00:25:27.163 { 00:25:27.163 "name": "BaseBdev4", 00:25:27.163 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:27.163 "is_configured": true, 00:25:27.163 "data_offset": 2048, 00:25:27.163 "data_size": 63488 00:25:27.163 } 00:25:27.164 ] 00:25:27.164 }' 00:25:27.164 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:27.164 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:27.164 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:27.164 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:27.164 13:34:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:27.425 [2024-07-25 13:34:08.036999] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:27.425 [2024-07-25 13:34:08.052994] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:27.425 [2024-07-25 13:34:08.053026] 
bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:27.425 [2024-07-25 13:34:08.053036] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:27.425 [2024-07-25 13:34:08.053041] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.425 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.685 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:27.685 "name": "raid_bdev1", 00:25:27.685 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 
00:25:27.685 "strip_size_kb": 0, 00:25:27.685 "state": "online", 00:25:27.685 "raid_level": "raid1", 00:25:27.685 "superblock": true, 00:25:27.685 "num_base_bdevs": 4, 00:25:27.685 "num_base_bdevs_discovered": 2, 00:25:27.685 "num_base_bdevs_operational": 2, 00:25:27.685 "base_bdevs_list": [ 00:25:27.685 { 00:25:27.685 "name": null, 00:25:27.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.685 "is_configured": false, 00:25:27.685 "data_offset": 2048, 00:25:27.685 "data_size": 63488 00:25:27.685 }, 00:25:27.685 { 00:25:27.685 "name": null, 00:25:27.685 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:27.685 "is_configured": false, 00:25:27.685 "data_offset": 2048, 00:25:27.685 "data_size": 63488 00:25:27.685 }, 00:25:27.685 { 00:25:27.685 "name": "BaseBdev3", 00:25:27.685 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:27.685 "is_configured": true, 00:25:27.685 "data_offset": 2048, 00:25:27.685 "data_size": 63488 00:25:27.685 }, 00:25:27.685 { 00:25:27.685 "name": "BaseBdev4", 00:25:27.685 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:27.685 "is_configured": true, 00:25:27.685 "data_offset": 2048, 00:25:27.685 "data_size": 63488 00:25:27.685 } 00:25:27.685 ] 00:25:27.685 }' 00:25:27.685 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:27.685 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:28.255 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:28.255 [2024-07-25 13:34:08.963492] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:28.255 [2024-07-25 13:34:08.963526] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:28.255 [2024-07-25 13:34:08.963541] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x11fc570 00:25:28.255 [2024-07-25 13:34:08.963552] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:28.255 [2024-07-25 13:34:08.963859] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:28.255 [2024-07-25 13:34:08.963871] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:28.255 [2024-07-25 13:34:08.963930] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:28.255 [2024-07-25 13:34:08.963937] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:28.255 [2024-07-25 13:34:08.963943] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:28.255 [2024-07-25 13:34:08.963956] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:28.255 [2024-07-25 13:34:08.966853] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x104a630 00:25:28.255 [2024-07-25 13:34:08.968000] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:28.255 spare 00:25:28.255 13:34:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # sleep 1 00:25:29.637 13:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:29.637 13:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:29.637 13:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:29.637 13:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:29.637 13:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:29.637 13:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.637 13:34:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:29.637 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:29.637 "name": "raid_bdev1", 00:25:29.637 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:29.637 "strip_size_kb": 0, 00:25:29.637 "state": "online", 00:25:29.637 "raid_level": "raid1", 00:25:29.637 "superblock": true, 00:25:29.637 "num_base_bdevs": 4, 00:25:29.637 "num_base_bdevs_discovered": 3, 00:25:29.637 "num_base_bdevs_operational": 3, 00:25:29.637 "process": { 00:25:29.637 "type": "rebuild", 00:25:29.637 "target": "spare", 00:25:29.637 "progress": { 00:25:29.637 "blocks": 22528, 00:25:29.637 "percent": 35 00:25:29.637 } 00:25:29.637 }, 00:25:29.637 "base_bdevs_list": [ 00:25:29.637 { 00:25:29.637 "name": "spare", 00:25:29.638 "uuid": "b68d8abe-be15-5fda-9d77-f33b8d615d04", 00:25:29.638 "is_configured": true, 00:25:29.638 "data_offset": 2048, 00:25:29.638 "data_size": 63488 00:25:29.638 }, 00:25:29.638 { 00:25:29.638 "name": null, 00:25:29.638 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:29.638 "is_configured": false, 00:25:29.638 "data_offset": 2048, 00:25:29.638 "data_size": 63488 00:25:29.638 }, 00:25:29.638 { 00:25:29.638 "name": "BaseBdev3", 00:25:29.638 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:29.638 "is_configured": true, 00:25:29.638 "data_offset": 2048, 00:25:29.638 "data_size": 63488 00:25:29.638 }, 00:25:29.638 { 00:25:29.638 "name": "BaseBdev4", 00:25:29.638 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:29.638 "is_configured": true, 00:25:29.638 "data_offset": 2048, 00:25:29.638 "data_size": 63488 00:25:29.638 } 00:25:29.638 ] 00:25:29.638 }' 00:25:29.638 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:25:29.638 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:29.638 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:29.638 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:29.638 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:29.898 [2024-07-25 13:34:10.464866] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:29.898 [2024-07-25 13:34:10.476862] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:29.898 [2024-07-25 13:34:10.476894] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:29.898 [2024-07-25 13:34:10.476904] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:29.898 [2024-07-25 13:34:10.476909] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:29.898 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.159 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:30.159 "name": "raid_bdev1", 00:25:30.159 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:30.159 "strip_size_kb": 0, 00:25:30.159 "state": "online", 00:25:30.159 "raid_level": "raid1", 00:25:30.159 "superblock": true, 00:25:30.159 "num_base_bdevs": 4, 00:25:30.159 "num_base_bdevs_discovered": 2, 00:25:30.159 "num_base_bdevs_operational": 2, 00:25:30.159 "base_bdevs_list": [ 00:25:30.159 { 00:25:30.159 "name": null, 00:25:30.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.159 "is_configured": false, 00:25:30.159 "data_offset": 2048, 00:25:30.159 "data_size": 63488 00:25:30.159 }, 00:25:30.159 { 00:25:30.159 "name": null, 00:25:30.159 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.159 "is_configured": false, 00:25:30.159 "data_offset": 2048, 00:25:30.159 "data_size": 63488 00:25:30.159 }, 00:25:30.159 { 00:25:30.159 "name": "BaseBdev3", 00:25:30.159 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:30.159 "is_configured": true, 00:25:30.159 "data_offset": 2048, 00:25:30.159 "data_size": 63488 00:25:30.159 }, 00:25:30.159 { 00:25:30.159 "name": "BaseBdev4", 00:25:30.159 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:30.159 "is_configured": true, 00:25:30.159 "data_offset": 2048, 
00:25:30.159 "data_size": 63488 00:25:30.159 } 00:25:30.159 ] 00:25:30.159 }' 00:25:30.159 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:30.159 13:34:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:30.730 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:30.730 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:30.730 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:30.730 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:30.730 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:30.730 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.730 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.730 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:30.730 "name": "raid_bdev1", 00:25:30.730 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:30.730 "strip_size_kb": 0, 00:25:30.730 "state": "online", 00:25:30.730 "raid_level": "raid1", 00:25:30.730 "superblock": true, 00:25:30.730 "num_base_bdevs": 4, 00:25:30.730 "num_base_bdevs_discovered": 2, 00:25:30.730 "num_base_bdevs_operational": 2, 00:25:30.730 "base_bdevs_list": [ 00:25:30.730 { 00:25:30.730 "name": null, 00:25:30.730 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.730 "is_configured": false, 00:25:30.730 "data_offset": 2048, 00:25:30.730 "data_size": 63488 00:25:30.730 }, 00:25:30.730 { 00:25:30.730 "name": null, 00:25:30.730 "uuid": "00000000-0000-0000-0000-000000000000", 
00:25:30.730 "is_configured": false, 00:25:30.730 "data_offset": 2048, 00:25:30.730 "data_size": 63488 00:25:30.730 }, 00:25:30.730 { 00:25:30.730 "name": "BaseBdev3", 00:25:30.730 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:30.730 "is_configured": true, 00:25:30.730 "data_offset": 2048, 00:25:30.730 "data_size": 63488 00:25:30.730 }, 00:25:30.730 { 00:25:30.730 "name": "BaseBdev4", 00:25:30.730 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:30.730 "is_configured": true, 00:25:30.730 "data_offset": 2048, 00:25:30.730 "data_size": 63488 00:25:30.730 } 00:25:30.730 ] 00:25:30.730 }' 00:25:30.730 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:30.730 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:30.731 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:30.731 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:30.731 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:30.990 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:31.250 [2024-07-25 13:34:11.884596] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:31.250 [2024-07-25 13:34:11.884627] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:31.250 [2024-07-25 13:34:11.884639] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x104af40 00:25:31.250 [2024-07-25 13:34:11.884646] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:31.250 
[2024-07-25 13:34:11.884926] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:31.250 [2024-07-25 13:34:11.884937] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:31.250 [2024-07-25 13:34:11.884983] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:31.250 [2024-07-25 13:34:11.884989] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:31.250 [2024-07-25 13:34:11.884995] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:31.250 BaseBdev1 00:25:31.250 13:34:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@789 -- # sleep 1 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.191 13:34:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.451 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:32.451 "name": "raid_bdev1", 00:25:32.451 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:32.451 "strip_size_kb": 0, 00:25:32.451 "state": "online", 00:25:32.451 "raid_level": "raid1", 00:25:32.451 "superblock": true, 00:25:32.451 "num_base_bdevs": 4, 00:25:32.451 "num_base_bdevs_discovered": 2, 00:25:32.451 "num_base_bdevs_operational": 2, 00:25:32.451 "base_bdevs_list": [ 00:25:32.451 { 00:25:32.451 "name": null, 00:25:32.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.451 "is_configured": false, 00:25:32.451 "data_offset": 2048, 00:25:32.451 "data_size": 63488 00:25:32.451 }, 00:25:32.451 { 00:25:32.451 "name": null, 00:25:32.451 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.451 "is_configured": false, 00:25:32.451 "data_offset": 2048, 00:25:32.451 "data_size": 63488 00:25:32.451 }, 00:25:32.451 { 00:25:32.451 "name": "BaseBdev3", 00:25:32.451 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:32.451 "is_configured": true, 00:25:32.451 "data_offset": 2048, 00:25:32.451 "data_size": 63488 00:25:32.451 }, 00:25:32.451 { 00:25:32.451 "name": "BaseBdev4", 00:25:32.451 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:32.451 "is_configured": true, 00:25:32.451 "data_offset": 2048, 00:25:32.451 "data_size": 63488 00:25:32.451 } 00:25:32.451 ] 00:25:32.451 }' 00:25:32.451 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:32.451 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:33.021 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@791 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:25:33.021 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:33.021 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:33.021 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:33.021 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:33.021 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.021 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:33.281 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:33.281 "name": "raid_bdev1", 00:25:33.281 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:33.281 "strip_size_kb": 0, 00:25:33.281 "state": "online", 00:25:33.281 "raid_level": "raid1", 00:25:33.281 "superblock": true, 00:25:33.281 "num_base_bdevs": 4, 00:25:33.281 "num_base_bdevs_discovered": 2, 00:25:33.281 "num_base_bdevs_operational": 2, 00:25:33.281 "base_bdevs_list": [ 00:25:33.281 { 00:25:33.281 "name": null, 00:25:33.281 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.281 "is_configured": false, 00:25:33.281 "data_offset": 2048, 00:25:33.281 "data_size": 63488 00:25:33.281 }, 00:25:33.281 { 00:25:33.281 "name": null, 00:25:33.281 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:33.281 "is_configured": false, 00:25:33.281 "data_offset": 2048, 00:25:33.281 "data_size": 63488 00:25:33.281 }, 00:25:33.281 { 00:25:33.281 "name": "BaseBdev3", 00:25:33.281 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:33.281 "is_configured": true, 00:25:33.281 "data_offset": 2048, 00:25:33.281 "data_size": 63488 00:25:33.281 }, 00:25:33.281 { 
00:25:33.281 "name": "BaseBdev4", 00:25:33.281 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:33.281 "is_configured": true, 00:25:33.281 "data_offset": 2048, 00:25:33.281 "data_size": 63488 00:25:33.281 } 00:25:33.281 ] 00:25:33.281 }' 00:25:33.281 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:33.281 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:33.281 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:33.281 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:33.281 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:33.281 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@650 -- # local es=0 00:25:33.282 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:33.282 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:33.282 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:33.282 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:33.282 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:33.282 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:33.282 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:25:33.282 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:33.282 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:33.282 13:34:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:33.542 [2024-07-25 13:34:14.158593] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:33.542 [2024-07-25 13:34:14.158687] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:33.542 [2024-07-25 13:34:14.158695] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:33.542 request: 00:25:33.542 { 00:25:33.542 "base_bdev": "BaseBdev1", 00:25:33.542 "raid_bdev": "raid_bdev1", 00:25:33.542 "method": "bdev_raid_add_base_bdev", 00:25:33.542 "req_id": 1 00:25:33.542 } 00:25:33.542 Got JSON-RPC error response 00:25:33.542 response: 00:25:33.542 { 00:25:33.542 "code": -22, 00:25:33.542 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:33.542 } 00:25:33.542 13:34:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@653 -- # es=1 00:25:33.542 13:34:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:25:33.542 13:34:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:25:33.542 13:34:14 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@677 -- # (( !es == 0 )) 00:25:33.542 13:34:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@793 -- # sleep 1 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.485 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.774 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:34.774 "name": "raid_bdev1", 00:25:34.774 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:34.774 "strip_size_kb": 0, 00:25:34.774 "state": "online", 00:25:34.774 "raid_level": "raid1", 00:25:34.774 "superblock": true, 00:25:34.774 "num_base_bdevs": 4, 00:25:34.774 
"num_base_bdevs_discovered": 2, 00:25:34.774 "num_base_bdevs_operational": 2, 00:25:34.774 "base_bdevs_list": [ 00:25:34.774 { 00:25:34.774 "name": null, 00:25:34.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.774 "is_configured": false, 00:25:34.774 "data_offset": 2048, 00:25:34.774 "data_size": 63488 00:25:34.774 }, 00:25:34.774 { 00:25:34.774 "name": null, 00:25:34.774 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.774 "is_configured": false, 00:25:34.774 "data_offset": 2048, 00:25:34.774 "data_size": 63488 00:25:34.774 }, 00:25:34.774 { 00:25:34.774 "name": "BaseBdev3", 00:25:34.774 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:34.774 "is_configured": true, 00:25:34.774 "data_offset": 2048, 00:25:34.774 "data_size": 63488 00:25:34.774 }, 00:25:34.774 { 00:25:34.774 "name": "BaseBdev4", 00:25:34.774 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:34.774 "is_configured": true, 00:25:34.774 "data_offset": 2048, 00:25:34.774 "data_size": 63488 00:25:34.774 } 00:25:34.774 ] 00:25:34.774 }' 00:25:34.774 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:34.774 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:35.353 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:35.353 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:35.353 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:35.353 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:35.353 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:35.353 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:35.353 13:34:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:35.353 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:35.353 "name": "raid_bdev1", 00:25:35.353 "uuid": "a8d8e8fb-7f6e-47d5-afbe-cfaf739b7224", 00:25:35.353 "strip_size_kb": 0, 00:25:35.353 "state": "online", 00:25:35.353 "raid_level": "raid1", 00:25:35.353 "superblock": true, 00:25:35.353 "num_base_bdevs": 4, 00:25:35.353 "num_base_bdevs_discovered": 2, 00:25:35.353 "num_base_bdevs_operational": 2, 00:25:35.353 "base_bdevs_list": [ 00:25:35.353 { 00:25:35.353 "name": null, 00:25:35.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.353 "is_configured": false, 00:25:35.353 "data_offset": 2048, 00:25:35.353 "data_size": 63488 00:25:35.353 }, 00:25:35.353 { 00:25:35.353 "name": null, 00:25:35.353 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:35.353 "is_configured": false, 00:25:35.353 "data_offset": 2048, 00:25:35.353 "data_size": 63488 00:25:35.353 }, 00:25:35.353 { 00:25:35.353 "name": "BaseBdev3", 00:25:35.353 "uuid": "17c31c5f-4b84-54db-a74e-56568e8aca0e", 00:25:35.353 "is_configured": true, 00:25:35.353 "data_offset": 2048, 00:25:35.353 "data_size": 63488 00:25:35.353 }, 00:25:35.353 { 00:25:35.353 "name": "BaseBdev4", 00:25:35.353 "uuid": "4270fb92-3990-5914-bccb-ffe9efca5957", 00:25:35.353 "is_configured": true, 00:25:35.353 "data_offset": 2048, 00:25:35.353 "data_size": 63488 00:25:35.353 } 00:25:35.353 ] 00:25:35.353 }' 00:25:35.353 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:35.613 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:35.613 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:35.613 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@798 -- # killprocess 1027651 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@950 -- # '[' -z 1027651 ']' 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # kill -0 1027651 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # uname 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1027651 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1027651' 00:25:35.614 killing process with pid 1027651 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@969 -- # kill 1027651 00:25:35.614 Received shutdown signal, test time was about 24.329137 seconds 00:25:35.614 00:25:35.614 Latency(us) 00:25:35.614 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:35.614 =================================================================================================================== 00:25:35.614 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:35.614 [2024-07-25 13:34:16.250508] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:35.614 [2024-07-25 13:34:16.250585] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:35.614 [2024-07-25 13:34:16.250628] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in 
destruct 00:25:35.614 [2024-07-25 13:34:16.250634] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1047540 name raid_bdev1, state offline 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@974 -- # wait 1027651 00:25:35.614 [2024-07-25 13:34:16.273846] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@800 -- # return 0 00:25:35.614 00:25:35.614 real 0m28.963s 00:25:35.614 user 0m45.864s 00:25:35.614 sys 0m3.502s 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:35.614 13:34:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:35.614 ************************************ 00:25:35.614 END TEST raid_rebuild_test_sb_io 00:25:35.614 ************************************ 00:25:35.874 13:34:16 bdev_raid -- bdev/bdev_raid.sh@964 -- # '[' n == y ']' 00:25:35.874 13:34:16 bdev_raid -- bdev/bdev_raid.sh@976 -- # base_blocklen=4096 00:25:35.874 13:34:16 bdev_raid -- bdev/bdev_raid.sh@978 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:25:35.874 13:34:16 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:25:35.874 13:34:16 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:35.874 13:34:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:35.874 ************************************ 00:25:35.874 START TEST raid_state_function_test_sb_4k 00:25:35.874 ************************************ 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:25:35.874 
13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@234 -- # strip_size=0 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=1033003 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1033003' 00:25:35.874 Process raid pid: 1033003 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 1033003 /var/tmp/spdk-raid.sock 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@831 -- # '[' -z 1033003 ']' 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:35.874 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:35.874 13:34:16 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:35.874 [2024-07-25 13:34:16.529905] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:25:35.874 [2024-07-25 13:34:16.529952] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:35.874 [2024-07-25 13:34:16.619629] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:36.135 [2024-07-25 13:34:16.684227] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:36.135 [2024-07-25 13:34:16.724542] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:36.135 [2024-07-25 13:34:16.724568] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:36.705 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:36.705 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@864 -- # return 0 00:25:36.705 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:36.966 [2024-07-25 13:34:17.543404] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:36.966 [2024-07-25 13:34:17.543433] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:36.966 [2024-07-25 13:34:17.543439] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:36.966 [2024-07-25 13:34:17.543445] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:36.966 "name": "Existed_Raid", 00:25:36.966 "uuid": "f5945f90-d193-43af-bf3a-b89386827033", 00:25:36.966 "strip_size_kb": 0, 00:25:36.966 "state": "configuring", 00:25:36.966 "raid_level": "raid1", 00:25:36.966 "superblock": true, 00:25:36.966 "num_base_bdevs": 2, 00:25:36.966 "num_base_bdevs_discovered": 0, 00:25:36.966 "num_base_bdevs_operational": 2, 00:25:36.966 "base_bdevs_list": [ 00:25:36.966 { 00:25:36.966 "name": "BaseBdev1", 00:25:36.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.966 "is_configured": false, 00:25:36.966 "data_offset": 0, 00:25:36.966 "data_size": 0 
00:25:36.966 }, 00:25:36.966 { 00:25:36.966 "name": "BaseBdev2", 00:25:36.966 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:36.966 "is_configured": false, 00:25:36.966 "data_offset": 0, 00:25:36.966 "data_size": 0 00:25:36.966 } 00:25:36.966 ] 00:25:36.966 }' 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:36.966 13:34:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:37.536 13:34:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:37.797 [2024-07-25 13:34:18.473648] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:37.797 [2024-07-25 13:34:18.473666] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17826b0 name Existed_Raid, state configuring 00:25:37.797 13:34:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:38.057 [2024-07-25 13:34:18.670166] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:38.057 [2024-07-25 13:34:18.670185] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:38.057 [2024-07-25 13:34:18.670190] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:38.057 [2024-07-25 13:34:18.670195] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:38.057 13:34:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:25:38.317 [2024-07-25 
13:34:18.865190] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:38.317 BaseBdev1 00:25:38.317 13:34:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:25:38.317 13:34:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:25:38.317 13:34:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:25:38.317 13:34:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # local i 00:25:38.317 13:34:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:25:38.317 13:34:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:25:38.317 13:34:18 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:38.317 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:38.578 [ 00:25:38.578 { 00:25:38.578 "name": "BaseBdev1", 00:25:38.578 "aliases": [ 00:25:38.578 "c135fb42-89be-456e-9682-e834f60fe3fc" 00:25:38.578 ], 00:25:38.578 "product_name": "Malloc disk", 00:25:38.578 "block_size": 4096, 00:25:38.578 "num_blocks": 8192, 00:25:38.578 "uuid": "c135fb42-89be-456e-9682-e834f60fe3fc", 00:25:38.578 "assigned_rate_limits": { 00:25:38.578 "rw_ios_per_sec": 0, 00:25:38.578 "rw_mbytes_per_sec": 0, 00:25:38.578 "r_mbytes_per_sec": 0, 00:25:38.578 "w_mbytes_per_sec": 0 00:25:38.578 }, 00:25:38.578 "claimed": true, 00:25:38.578 "claim_type": "exclusive_write", 00:25:38.578 "zoned": false, 00:25:38.578 "supported_io_types": { 00:25:38.578 "read": true, 00:25:38.578 "write": true, 
00:25:38.578 "unmap": true, 00:25:38.578 "flush": true, 00:25:38.578 "reset": true, 00:25:38.578 "nvme_admin": false, 00:25:38.578 "nvme_io": false, 00:25:38.578 "nvme_io_md": false, 00:25:38.578 "write_zeroes": true, 00:25:38.578 "zcopy": true, 00:25:38.578 "get_zone_info": false, 00:25:38.578 "zone_management": false, 00:25:38.578 "zone_append": false, 00:25:38.578 "compare": false, 00:25:38.578 "compare_and_write": false, 00:25:38.578 "abort": true, 00:25:38.578 "seek_hole": false, 00:25:38.578 "seek_data": false, 00:25:38.578 "copy": true, 00:25:38.578 "nvme_iov_md": false 00:25:38.578 }, 00:25:38.578 "memory_domains": [ 00:25:38.578 { 00:25:38.578 "dma_device_id": "system", 00:25:38.578 "dma_device_type": 1 00:25:38.578 }, 00:25:38.578 { 00:25:38.578 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:38.578 "dma_device_type": 2 00:25:38.578 } 00:25:38.578 ], 00:25:38.578 "driver_specific": {} 00:25:38.578 } 00:25:38.578 ] 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@907 -- # return 0 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:38.578 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:38.839 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:38.839 "name": "Existed_Raid", 00:25:38.839 "uuid": "ae67963d-3f89-43b4-81c7-c0eab8252e93", 00:25:38.839 "strip_size_kb": 0, 00:25:38.839 "state": "configuring", 00:25:38.839 "raid_level": "raid1", 00:25:38.839 "superblock": true, 00:25:38.839 "num_base_bdevs": 2, 00:25:38.839 "num_base_bdevs_discovered": 1, 00:25:38.839 "num_base_bdevs_operational": 2, 00:25:38.839 "base_bdevs_list": [ 00:25:38.839 { 00:25:38.839 "name": "BaseBdev1", 00:25:38.839 "uuid": "c135fb42-89be-456e-9682-e834f60fe3fc", 00:25:38.839 "is_configured": true, 00:25:38.839 "data_offset": 256, 00:25:38.839 "data_size": 7936 00:25:38.839 }, 00:25:38.839 { 00:25:38.839 "name": "BaseBdev2", 00:25:38.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:38.839 "is_configured": false, 00:25:38.839 "data_offset": 0, 00:25:38.839 "data_size": 0 00:25:38.839 } 00:25:38.839 ] 00:25:38.839 }' 00:25:38.839 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:38.839 13:34:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:39.409 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:39.409 [2024-07-25 13:34:20.188533] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:39.410 [2024-07-25 13:34:20.188568] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1781fa0 name Existed_Raid, state configuring 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:39.670 [2024-07-25 13:34:20.381046] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:39.670 [2024-07-25 13:34:20.382174] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:39.670 [2024-07-25 13:34:20.382196] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:39.670 13:34:20 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.670 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:39.930 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:39.930 "name": "Existed_Raid", 00:25:39.930 "uuid": "0f67d64b-d8c6-4169-bfe3-55b73d6502e8", 00:25:39.930 "strip_size_kb": 0, 00:25:39.930 "state": "configuring", 00:25:39.930 "raid_level": "raid1", 00:25:39.930 "superblock": true, 00:25:39.930 "num_base_bdevs": 2, 00:25:39.930 "num_base_bdevs_discovered": 1, 00:25:39.930 "num_base_bdevs_operational": 2, 00:25:39.930 "base_bdevs_list": [ 00:25:39.930 { 00:25:39.930 "name": "BaseBdev1", 00:25:39.930 "uuid": "c135fb42-89be-456e-9682-e834f60fe3fc", 00:25:39.930 "is_configured": true, 00:25:39.930 "data_offset": 256, 00:25:39.930 "data_size": 7936 00:25:39.930 }, 00:25:39.930 { 00:25:39.930 "name": "BaseBdev2", 00:25:39.930 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:39.930 "is_configured": false, 00:25:39.930 "data_offset": 0, 00:25:39.930 "data_size": 0 00:25:39.930 } 00:25:39.930 ] 00:25:39.930 }' 00:25:39.930 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:39.930 13:34:20 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:40.500 
13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:25:40.761 [2024-07-25 13:34:21.340455] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:40.761 [2024-07-25 13:34:21.340570] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1782da0 00:25:40.761 [2024-07-25 13:34:21.340579] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:40.761 [2024-07-25 13:34:21.340717] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1781b00 00:25:40.761 [2024-07-25 13:34:21.340809] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1782da0 00:25:40.761 [2024-07-25 13:34:21.340815] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1782da0 00:25:40.761 [2024-07-25 13:34:21.340882] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:40.761 BaseBdev2 00:25:40.761 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:25:40.761 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:25:40.761 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:25:40.761 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # local i 00:25:40.761 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:25:40.761 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:25:40.761 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:40.761 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:41.022 [ 00:25:41.022 { 00:25:41.022 "name": "BaseBdev2", 00:25:41.022 "aliases": [ 00:25:41.022 "aeb59374-9b00-44fb-8861-44301d24549d" 00:25:41.022 ], 00:25:41.022 "product_name": "Malloc disk", 00:25:41.022 "block_size": 4096, 00:25:41.022 "num_blocks": 8192, 00:25:41.022 "uuid": "aeb59374-9b00-44fb-8861-44301d24549d", 00:25:41.022 "assigned_rate_limits": { 00:25:41.022 "rw_ios_per_sec": 0, 00:25:41.022 "rw_mbytes_per_sec": 0, 00:25:41.022 "r_mbytes_per_sec": 0, 00:25:41.022 "w_mbytes_per_sec": 0 00:25:41.022 }, 00:25:41.022 "claimed": true, 00:25:41.022 "claim_type": "exclusive_write", 00:25:41.022 "zoned": false, 00:25:41.022 "supported_io_types": { 00:25:41.022 "read": true, 00:25:41.022 "write": true, 00:25:41.022 "unmap": true, 00:25:41.022 "flush": true, 00:25:41.022 "reset": true, 00:25:41.022 "nvme_admin": false, 00:25:41.022 "nvme_io": false, 00:25:41.022 "nvme_io_md": false, 00:25:41.022 "write_zeroes": true, 00:25:41.022 "zcopy": true, 00:25:41.022 "get_zone_info": false, 00:25:41.022 "zone_management": false, 00:25:41.022 "zone_append": false, 00:25:41.022 "compare": false, 00:25:41.022 "compare_and_write": false, 00:25:41.022 "abort": true, 00:25:41.022 "seek_hole": false, 00:25:41.022 "seek_data": false, 00:25:41.022 "copy": true, 00:25:41.022 "nvme_iov_md": false 00:25:41.022 }, 00:25:41.022 "memory_domains": [ 00:25:41.022 { 00:25:41.022 "dma_device_id": "system", 00:25:41.022 "dma_device_type": 1 00:25:41.022 }, 00:25:41.022 { 00:25:41.022 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:41.022 "dma_device_type": 2 00:25:41.022 } 00:25:41.022 ], 00:25:41.022 "driver_specific": {} 00:25:41.022 } 00:25:41.022 ] 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@907 -- # return 0 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.022 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:41.283 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:41.283 "name": "Existed_Raid", 00:25:41.283 "uuid": 
"0f67d64b-d8c6-4169-bfe3-55b73d6502e8", 00:25:41.283 "strip_size_kb": 0, 00:25:41.283 "state": "online", 00:25:41.283 "raid_level": "raid1", 00:25:41.283 "superblock": true, 00:25:41.283 "num_base_bdevs": 2, 00:25:41.283 "num_base_bdevs_discovered": 2, 00:25:41.283 "num_base_bdevs_operational": 2, 00:25:41.283 "base_bdevs_list": [ 00:25:41.283 { 00:25:41.283 "name": "BaseBdev1", 00:25:41.283 "uuid": "c135fb42-89be-456e-9682-e834f60fe3fc", 00:25:41.283 "is_configured": true, 00:25:41.283 "data_offset": 256, 00:25:41.283 "data_size": 7936 00:25:41.283 }, 00:25:41.283 { 00:25:41.283 "name": "BaseBdev2", 00:25:41.283 "uuid": "aeb59374-9b00-44fb-8861-44301d24549d", 00:25:41.283 "is_configured": true, 00:25:41.283 "data_offset": 256, 00:25:41.283 "data_size": 7936 00:25:41.283 } 00:25:41.283 ] 00:25:41.283 }' 00:25:41.283 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:41.283 13:34:21 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:41.854 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:25:41.854 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:41.854 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:41.854 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:41.854 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:41.854 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:25:41.854 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:41.854 13:34:22 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:41.854 [2024-07-25 13:34:22.643955] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:42.115 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:42.115 "name": "Existed_Raid", 00:25:42.115 "aliases": [ 00:25:42.115 "0f67d64b-d8c6-4169-bfe3-55b73d6502e8" 00:25:42.115 ], 00:25:42.115 "product_name": "Raid Volume", 00:25:42.115 "block_size": 4096, 00:25:42.115 "num_blocks": 7936, 00:25:42.115 "uuid": "0f67d64b-d8c6-4169-bfe3-55b73d6502e8", 00:25:42.115 "assigned_rate_limits": { 00:25:42.115 "rw_ios_per_sec": 0, 00:25:42.115 "rw_mbytes_per_sec": 0, 00:25:42.115 "r_mbytes_per_sec": 0, 00:25:42.115 "w_mbytes_per_sec": 0 00:25:42.115 }, 00:25:42.115 "claimed": false, 00:25:42.115 "zoned": false, 00:25:42.115 "supported_io_types": { 00:25:42.115 "read": true, 00:25:42.115 "write": true, 00:25:42.115 "unmap": false, 00:25:42.115 "flush": false, 00:25:42.115 "reset": true, 00:25:42.115 "nvme_admin": false, 00:25:42.115 "nvme_io": false, 00:25:42.115 "nvme_io_md": false, 00:25:42.115 "write_zeroes": true, 00:25:42.115 "zcopy": false, 00:25:42.115 "get_zone_info": false, 00:25:42.115 "zone_management": false, 00:25:42.115 "zone_append": false, 00:25:42.115 "compare": false, 00:25:42.115 "compare_and_write": false, 00:25:42.115 "abort": false, 00:25:42.115 "seek_hole": false, 00:25:42.115 "seek_data": false, 00:25:42.115 "copy": false, 00:25:42.115 "nvme_iov_md": false 00:25:42.115 }, 00:25:42.115 "memory_domains": [ 00:25:42.115 { 00:25:42.115 "dma_device_id": "system", 00:25:42.115 "dma_device_type": 1 00:25:42.115 }, 00:25:42.115 { 00:25:42.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:42.115 "dma_device_type": 2 00:25:42.115 }, 00:25:42.115 { 00:25:42.115 "dma_device_id": "system", 00:25:42.115 "dma_device_type": 1 00:25:42.115 }, 00:25:42.115 { 00:25:42.115 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:25:42.115 "dma_device_type": 2 00:25:42.115 } 00:25:42.115 ], 00:25:42.115 "driver_specific": { 00:25:42.115 "raid": { 00:25:42.115 "uuid": "0f67d64b-d8c6-4169-bfe3-55b73d6502e8", 00:25:42.115 "strip_size_kb": 0, 00:25:42.115 "state": "online", 00:25:42.115 "raid_level": "raid1", 00:25:42.115 "superblock": true, 00:25:42.115 "num_base_bdevs": 2, 00:25:42.115 "num_base_bdevs_discovered": 2, 00:25:42.115 "num_base_bdevs_operational": 2, 00:25:42.115 "base_bdevs_list": [ 00:25:42.115 { 00:25:42.115 "name": "BaseBdev1", 00:25:42.115 "uuid": "c135fb42-89be-456e-9682-e834f60fe3fc", 00:25:42.115 "is_configured": true, 00:25:42.115 "data_offset": 256, 00:25:42.115 "data_size": 7936 00:25:42.115 }, 00:25:42.115 { 00:25:42.115 "name": "BaseBdev2", 00:25:42.115 "uuid": "aeb59374-9b00-44fb-8861-44301d24549d", 00:25:42.115 "is_configured": true, 00:25:42.115 "data_offset": 256, 00:25:42.115 "data_size": 7936 00:25:42.115 } 00:25:42.115 ] 00:25:42.115 } 00:25:42.115 } 00:25:42.115 }' 00:25:42.115 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:42.115 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:25:42.115 BaseBdev2' 00:25:42.115 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:42.115 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:42.115 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:42.115 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:42.115 "name": "BaseBdev1", 00:25:42.115 "aliases": [ 00:25:42.115 "c135fb42-89be-456e-9682-e834f60fe3fc" 
00:25:42.115 ], 00:25:42.115 "product_name": "Malloc disk", 00:25:42.115 "block_size": 4096, 00:25:42.115 "num_blocks": 8192, 00:25:42.115 "uuid": "c135fb42-89be-456e-9682-e834f60fe3fc", 00:25:42.115 "assigned_rate_limits": { 00:25:42.115 "rw_ios_per_sec": 0, 00:25:42.115 "rw_mbytes_per_sec": 0, 00:25:42.115 "r_mbytes_per_sec": 0, 00:25:42.115 "w_mbytes_per_sec": 0 00:25:42.115 }, 00:25:42.115 "claimed": true, 00:25:42.115 "claim_type": "exclusive_write", 00:25:42.115 "zoned": false, 00:25:42.115 "supported_io_types": { 00:25:42.115 "read": true, 00:25:42.115 "write": true, 00:25:42.115 "unmap": true, 00:25:42.115 "flush": true, 00:25:42.115 "reset": true, 00:25:42.115 "nvme_admin": false, 00:25:42.115 "nvme_io": false, 00:25:42.115 "nvme_io_md": false, 00:25:42.115 "write_zeroes": true, 00:25:42.115 "zcopy": true, 00:25:42.115 "get_zone_info": false, 00:25:42.115 "zone_management": false, 00:25:42.115 "zone_append": false, 00:25:42.115 "compare": false, 00:25:42.115 "compare_and_write": false, 00:25:42.115 "abort": true, 00:25:42.115 "seek_hole": false, 00:25:42.115 "seek_data": false, 00:25:42.115 "copy": true, 00:25:42.115 "nvme_iov_md": false 00:25:42.115 }, 00:25:42.115 "memory_domains": [ 00:25:42.115 { 00:25:42.115 "dma_device_id": "system", 00:25:42.115 "dma_device_type": 1 00:25:42.115 }, 00:25:42.115 { 00:25:42.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:42.115 "dma_device_type": 2 00:25:42.115 } 00:25:42.115 ], 00:25:42.115 "driver_specific": {} 00:25:42.115 }' 00:25:42.115 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:42.375 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:42.375 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:42.375 13:34:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:42.375 13:34:23 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:42.375 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:42.375 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:42.375 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:42.375 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:42.634 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:42.634 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:42.634 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:42.634 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:42.634 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:42.634 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:42.894 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:42.894 "name": "BaseBdev2", 00:25:42.894 "aliases": [ 00:25:42.894 "aeb59374-9b00-44fb-8861-44301d24549d" 00:25:42.894 ], 00:25:42.894 "product_name": "Malloc disk", 00:25:42.894 "block_size": 4096, 00:25:42.894 "num_blocks": 8192, 00:25:42.894 "uuid": "aeb59374-9b00-44fb-8861-44301d24549d", 00:25:42.894 "assigned_rate_limits": { 00:25:42.894 "rw_ios_per_sec": 0, 00:25:42.894 "rw_mbytes_per_sec": 0, 00:25:42.894 "r_mbytes_per_sec": 0, 00:25:42.894 "w_mbytes_per_sec": 0 00:25:42.894 }, 00:25:42.894 "claimed": true, 00:25:42.894 "claim_type": "exclusive_write", 00:25:42.894 "zoned": false, 
00:25:42.894 "supported_io_types": { 00:25:42.894 "read": true, 00:25:42.894 "write": true, 00:25:42.894 "unmap": true, 00:25:42.894 "flush": true, 00:25:42.894 "reset": true, 00:25:42.894 "nvme_admin": false, 00:25:42.894 "nvme_io": false, 00:25:42.894 "nvme_io_md": false, 00:25:42.894 "write_zeroes": true, 00:25:42.894 "zcopy": true, 00:25:42.894 "get_zone_info": false, 00:25:42.894 "zone_management": false, 00:25:42.894 "zone_append": false, 00:25:42.894 "compare": false, 00:25:42.894 "compare_and_write": false, 00:25:42.894 "abort": true, 00:25:42.894 "seek_hole": false, 00:25:42.894 "seek_data": false, 00:25:42.894 "copy": true, 00:25:42.894 "nvme_iov_md": false 00:25:42.894 }, 00:25:42.894 "memory_domains": [ 00:25:42.894 { 00:25:42.894 "dma_device_id": "system", 00:25:42.894 "dma_device_type": 1 00:25:42.894 }, 00:25:42.894 { 00:25:42.894 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:42.894 "dma_device_type": 2 00:25:42.894 } 00:25:42.894 ], 00:25:42.894 "driver_specific": {} 00:25:42.894 }' 00:25:42.894 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:42.894 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:42.894 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:42.894 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:42.894 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:42.894 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:42.894 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:42.894 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:43.154 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
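The trace above shows `bdev_raid.sh` piping `rpc.py bdev_get_bdevs` output through `jq` filters (`.block_size`, `.md_size`, `.md_interleave`, `.dif_type`) and comparing each result against an expected value (`[[ 4096 == 4096 ]]`, `[[ null == null ]]`). A minimal Python sketch of the same per-base-bdev property check, assuming a record shaped like the BaseBdev2 JSON dumped in this log (the `check_base_bdev` helper name is hypothetical; it is not part of the SPDK scripts):

```python
import json

# Abridged record mirroring the BaseBdev2 JSON dumped by
# `rpc.py bdev_get_bdevs -b BaseBdev2` earlier in this log.
BDEV_JSON = """
{
  "name": "BaseBdev2",
  "product_name": "Malloc disk",
  "block_size": 4096,
  "num_blocks": 8192,
  "claimed": true,
  "claim_type": "exclusive_write"
}
"""

def check_base_bdev(info, expected_block_size=4096):
    """Mirror the jq checks traced above: .block_size must equal the
    expected size, and .md_size / .md_interleave / .dif_type must be
    absent (jq prints `null` for a missing key)."""
    if info.get("block_size") != expected_block_size:
        return False
    # dict.get() returns None for missing keys, matching jq's `null`.
    for key in ("md_size", "md_interleave", "dif_type"):
        if info.get(key) is not None:
            return False
    return True

info = json.loads(BDEV_JSON)
print(check_base_bdev(info))  # True for the record above
```

The shell version runs one `jq` invocation per field; collecting the checks into a single function avoids re-parsing the RPC output for each comparison.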
00:25:43.154 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:43.154 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:43.154 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:43.154 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:43.415 [2024-07-25 13:34:23.959098] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:43.415 13:34:23 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.415 13:34:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:43.415 13:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:43.415 "name": "Existed_Raid", 00:25:43.415 "uuid": "0f67d64b-d8c6-4169-bfe3-55b73d6502e8", 00:25:43.415 "strip_size_kb": 0, 00:25:43.415 "state": "online", 00:25:43.415 "raid_level": "raid1", 00:25:43.415 "superblock": true, 00:25:43.415 "num_base_bdevs": 2, 00:25:43.415 "num_base_bdevs_discovered": 1, 00:25:43.415 "num_base_bdevs_operational": 1, 00:25:43.415 "base_bdevs_list": [ 00:25:43.415 { 00:25:43.415 "name": null, 00:25:43.415 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:43.415 "is_configured": false, 00:25:43.415 "data_offset": 256, 00:25:43.415 "data_size": 7936 00:25:43.415 }, 00:25:43.415 { 00:25:43.415 "name": "BaseBdev2", 00:25:43.415 "uuid": "aeb59374-9b00-44fb-8861-44301d24549d", 00:25:43.415 "is_configured": true, 00:25:43.415 "data_offset": 256, 00:25:43.415 "data_size": 7936 00:25:43.415 } 00:25:43.415 ] 00:25:43.415 }' 00:25:43.415 13:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:43.415 13:34:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:43.986 13:34:24 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:25:43.986 13:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:43.986 13:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:43.986 13:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:25:44.246 13:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:25:44.246 13:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:44.246 13:34:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:25:44.506 [2024-07-25 13:34:25.057884] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:44.506 [2024-07-25 13:34:25.057944] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:44.506 [2024-07-25 13:34:25.063971] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:44.506 [2024-07-25 13:34:25.063996] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:44.506 [2024-07-25 13:34:25.064002] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1782da0 name Existed_Raid, state offline 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 1033003 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@950 -- # '[' -z 1033003 ']' 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # kill -0 1033003 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # uname 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:44.506 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1033003 00:25:44.767 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:25:44.767 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:44.767 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1033003' 00:25:44.767 killing process with pid 1033003 00:25:44.767 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@969 -- # kill 1033003 00:25:44.767 [2024-07-25 13:34:25.325986] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:44.767 13:34:25 bdev_raid.raid_state_function_test_sb_4k 
-- common/autotest_common.sh@974 -- # wait 1033003 00:25:44.767 [2024-07-25 13:34:25.326581] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:44.767 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:25:44.767 00:25:44.767 real 0m8.975s 00:25:44.767 user 0m16.241s 00:25:44.767 sys 0m1.442s 00:25:44.767 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:44.767 13:34:25 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:44.767 ************************************ 00:25:44.767 END TEST raid_state_function_test_sb_4k 00:25:44.767 ************************************ 00:25:44.767 13:34:25 bdev_raid -- bdev/bdev_raid.sh@979 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:25:44.767 13:34:25 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:25:44.767 13:34:25 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:44.767 13:34:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:44.767 ************************************ 00:25:44.767 START TEST raid_superblock_test_4k 00:25:44.767 ************************************ 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # local 
base_bdevs_pt 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@414 -- # local strip_size 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:25:44.767 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:25:44.768 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@427 -- # raid_pid=1034747 00:25:44.768 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@428 -- # waitforlisten 1034747 /var/tmp/spdk-raid.sock 00:25:44.768 13:34:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@831 -- # '[' -z 1034747 ']' 00:25:44.768 13:34:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:44.768 13:34:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:44.768 13:34:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:44.768 13:34:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:25:44.768 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:44.768 13:34:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:44.768 13:34:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:45.027 [2024-07-25 13:34:25.577239] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:25:45.027 [2024-07-25 13:34:25.577286] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1034747 ] 00:25:45.027 [2024-07-25 13:34:25.663798] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:45.027 [2024-07-25 13:34:25.728140] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:45.027 [2024-07-25 13:34:25.767374] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:45.027 [2024-07-25 13:34:25.767396] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@864 -- # return 0 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:45.968 13:34:26 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:25:45.968 malloc1 00:25:45.968 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:46.228 [2024-07-25 13:34:26.785759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:46.228 [2024-07-25 13:34:26.785793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:46.228 [2024-07-25 13:34:26.785804] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27219b0 00:25:46.228 [2024-07-25 13:34:26.785811] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:46.228 [2024-07-25 13:34:26.787094] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:46.228 [2024-07-25 13:34:26.787113] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:46.228 pt1 00:25:46.228 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:25:46.228 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:25:46.228 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:25:46.228 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:25:46.228 
13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:46.228 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:46.228 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:25:46.228 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:46.228 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:25:46.228 malloc2 00:25:46.228 13:34:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:46.488 [2024-07-25 13:34:27.172801] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:46.488 [2024-07-25 13:34:27.172829] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:46.488 [2024-07-25 13:34:27.172838] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2722db0 00:25:46.488 [2024-07-25 13:34:27.172846] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:46.488 [2024-07-25 13:34:27.174076] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:46.488 [2024-07-25 13:34:27.174095] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:46.488 pt2 00:25:46.488 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:25:46.488 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:25:46.488 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@445 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:25:46.749 [2024-07-25 13:34:27.349251] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:46.749 [2024-07-25 13:34:27.350242] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:46.749 [2024-07-25 13:34:27.350345] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x28c56b0 00:25:46.749 [2024-07-25 13:34:27.350353] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:46.749 [2024-07-25 13:34:27.350502] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x271aca0 00:25:46.749 [2024-07-25 13:34:27.350618] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28c56b0 00:25:46.749 [2024-07-25 13:34:27.350624] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28c56b0 00:25:46.749 [2024-07-25 13:34:27.350703] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:46.749 13:34:27 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp
00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:46.749 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:47.008 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:25:47.008 "name": "raid_bdev1",
00:25:47.008 "uuid": "0836fcf2-4683-4256-aa5c-6321c4b74d51",
00:25:47.008 "strip_size_kb": 0,
00:25:47.008 "state": "online",
00:25:47.008 "raid_level": "raid1",
00:25:47.008 "superblock": true,
00:25:47.008 "num_base_bdevs": 2,
00:25:47.008 "num_base_bdevs_discovered": 2,
00:25:47.008 "num_base_bdevs_operational": 2,
00:25:47.008 "base_bdevs_list": [
00:25:47.008 {
00:25:47.008 "name": "pt1",
00:25:47.008 "uuid": "00000000-0000-0000-0000-000000000001",
00:25:47.008 "is_configured": true,
00:25:47.008 "data_offset": 256,
00:25:47.008 "data_size": 7936
00:25:47.008 },
00:25:47.008 {
00:25:47.008 "name": "pt2",
00:25:47.008 "uuid": "00000000-0000-0000-0000-000000000002",
00:25:47.008 "is_configured": true,
00:25:47.008 "data_offset": 256,
00:25:47.008 "data_size": 7936
00:25:47.008 }
00:25:47.008 ]
00:25:47.008 }'
00:25:47.008 13:34:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:25:47.008 13:34:27 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x
00:25:47.577 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1
00:25:47.577 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1
00:25:47.577 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:25:47.577 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:25:47.577 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:25:47.577 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name
00:25:47.577 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:25:47.577 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:25:47.577 [2024-07-25 13:34:28.283787] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:25:47.577 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:25:47.577 "name": "raid_bdev1",
00:25:47.577 "aliases": [
00:25:47.577 "0836fcf2-4683-4256-aa5c-6321c4b74d51"
00:25:47.577 ],
00:25:47.577 "product_name": "Raid Volume",
00:25:47.577 "block_size": 4096,
00:25:47.577 "num_blocks": 7936,
00:25:47.577 "uuid": "0836fcf2-4683-4256-aa5c-6321c4b74d51",
00:25:47.577 "assigned_rate_limits": {
00:25:47.577 "rw_ios_per_sec": 0,
00:25:47.577 "rw_mbytes_per_sec": 0,
00:25:47.577 "r_mbytes_per_sec": 0,
00:25:47.577 "w_mbytes_per_sec": 0
00:25:47.577 },
00:25:47.577 "claimed": false,
00:25:47.577 "zoned": false,
00:25:47.577 "supported_io_types": {
00:25:47.577 "read": true,
00:25:47.577 "write": true,
00:25:47.577 "unmap": false,
00:25:47.577 "flush": false,
00:25:47.577 "reset": true,
00:25:47.577 "nvme_admin": false,
00:25:47.577 "nvme_io": false,
00:25:47.577 "nvme_io_md": false,
00:25:47.577 "write_zeroes": true,
00:25:47.577 "zcopy": false,
00:25:47.577 "get_zone_info": false,
00:25:47.577 "zone_management": false,
00:25:47.577 "zone_append": false,
00:25:47.577 "compare": false,
00:25:47.577 "compare_and_write": false,
00:25:47.577 "abort": false,
00:25:47.577 "seek_hole": false,
00:25:47.577 "seek_data": false,
00:25:47.577 "copy": false,
00:25:47.577 "nvme_iov_md": false
00:25:47.577 },
00:25:47.577 "memory_domains": [
00:25:47.577 {
00:25:47.577 "dma_device_id": "system",
00:25:47.577 "dma_device_type": 1
00:25:47.577 },
00:25:47.577 {
00:25:47.578 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:25:47.578 "dma_device_type": 2
00:25:47.578 },
00:25:47.578 {
00:25:47.578 "dma_device_id": "system",
00:25:47.578 "dma_device_type": 1
00:25:47.578 },
00:25:47.578 {
00:25:47.578 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:25:47.578 "dma_device_type": 2
00:25:47.578 }
00:25:47.578 ],
00:25:47.578 "driver_specific": {
00:25:47.578 "raid": {
00:25:47.578 "uuid": "0836fcf2-4683-4256-aa5c-6321c4b74d51",
00:25:47.578 "strip_size_kb": 0,
00:25:47.578 "state": "online",
00:25:47.578 "raid_level": "raid1",
00:25:47.578 "superblock": true,
00:25:47.578 "num_base_bdevs": 2,
00:25:47.578 "num_base_bdevs_discovered": 2,
00:25:47.578 "num_base_bdevs_operational": 2,
00:25:47.578 "base_bdevs_list": [
00:25:47.578 {
00:25:47.578 "name": "pt1",
00:25:47.578 "uuid": "00000000-0000-0000-0000-000000000001",
00:25:47.578 "is_configured": true,
00:25:47.578 "data_offset": 256,
00:25:47.578 "data_size": 7936
00:25:47.578 },
00:25:47.578 {
00:25:47.578 "name": "pt2",
00:25:47.578 "uuid": "00000000-0000-0000-0000-000000000002",
00:25:47.578 "is_configured": true,
00:25:47.578 "data_offset": 256,
00:25:47.578 "data_size": 7936
00:25:47.578 }
00:25:47.578 ]
00:25:47.578 }
00:25:47.578 }
00:25:47.578 }'
00:25:47.578 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:25:47.578 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1
00:25:47.578 pt2'
00:25:47.578 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:25:47.578 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1
00:25:47.578 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:25:47.836 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:25:47.837 "name": "pt1",
00:25:47.837 "aliases": [
00:25:47.837 "00000000-0000-0000-0000-000000000001"
00:25:47.837 ],
00:25:47.837 "product_name": "passthru",
00:25:47.837 "block_size": 4096,
00:25:47.837 "num_blocks": 8192,
00:25:47.837 "uuid": "00000000-0000-0000-0000-000000000001",
00:25:47.837 "assigned_rate_limits": {
00:25:47.837 "rw_ios_per_sec": 0,
00:25:47.837 "rw_mbytes_per_sec": 0,
00:25:47.837 "r_mbytes_per_sec": 0,
00:25:47.837 "w_mbytes_per_sec": 0
00:25:47.837 },
00:25:47.837 "claimed": true,
00:25:47.837 "claim_type": "exclusive_write",
00:25:47.837 "zoned": false,
00:25:47.837 "supported_io_types": {
00:25:47.837 "read": true,
00:25:47.837 "write": true,
00:25:47.837 "unmap": true,
00:25:47.837 "flush": true,
00:25:47.837 "reset": true,
00:25:47.837 "nvme_admin": false,
00:25:47.837 "nvme_io": false,
00:25:47.837 "nvme_io_md": false,
00:25:47.837 "write_zeroes": true,
00:25:47.837 "zcopy": true,
00:25:47.837 "get_zone_info": false,
00:25:47.837 "zone_management": false,
00:25:47.837 "zone_append": false,
00:25:47.837 "compare": false,
00:25:47.837 "compare_and_write": false,
00:25:47.837 "abort": true,
00:25:47.837 "seek_hole": false,
00:25:47.837 "seek_data": false,
00:25:47.837 "copy": true,
00:25:47.837 "nvme_iov_md": false
00:25:47.837 },
00:25:47.837 "memory_domains": [
00:25:47.837 {
00:25:47.837 "dma_device_id": "system",
00:25:47.837 "dma_device_type": 1
00:25:47.837 },
00:25:47.837 {
00:25:47.837 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:25:47.837 "dma_device_type": 2
00:25:47.837 }
00:25:47.837 ],
00:25:47.837 "driver_specific": {
00:25:47.837 "passthru": {
00:25:47.837 "name": "pt1",
00:25:47.837 "base_bdev_name": "malloc1"
00:25:47.837 }
00:25:47.837 }
00:25:47.837 }'
00:25:47.837 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:25:47.837 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:25:48.096 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]]
00:25:48.096 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:25:48.096 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:25:48.096 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:25:48.096 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:25:48.096 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:25:48.096 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:25:48.096 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:25:48.096 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:25:48.356 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:25:48.356 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:25:48.356 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2
00:25:48.356 13:34:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:25:48.356 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:25:48.356 "name": "pt2",
00:25:48.356 "aliases": [
00:25:48.356 "00000000-0000-0000-0000-000000000002"
00:25:48.356 ],
00:25:48.356 "product_name": "passthru",
00:25:48.356 "block_size": 4096,
00:25:48.356 "num_blocks": 8192,
00:25:48.356 "uuid": "00000000-0000-0000-0000-000000000002",
00:25:48.356 "assigned_rate_limits": {
00:25:48.356 "rw_ios_per_sec": 0,
00:25:48.356 "rw_mbytes_per_sec": 0,
00:25:48.356 "r_mbytes_per_sec": 0,
00:25:48.356 "w_mbytes_per_sec": 0
00:25:48.356 },
00:25:48.356 "claimed": true,
00:25:48.356 "claim_type": "exclusive_write",
00:25:48.356 "zoned": false,
00:25:48.356 "supported_io_types": {
00:25:48.356 "read": true,
00:25:48.356 "write": true,
00:25:48.356 "unmap": true,
00:25:48.356 "flush": true,
00:25:48.356 "reset": true,
00:25:48.356 "nvme_admin": false,
00:25:48.356 "nvme_io": false,
00:25:48.356 "nvme_io_md": false,
00:25:48.356 "write_zeroes": true,
00:25:48.356 "zcopy": true,
00:25:48.356 "get_zone_info": false,
00:25:48.356 "zone_management": false,
00:25:48.356 "zone_append": false,
00:25:48.356 "compare": false,
00:25:48.356 "compare_and_write": false,
00:25:48.356 "abort": true,
00:25:48.356 "seek_hole": false,
00:25:48.356 "seek_data": false,
00:25:48.356 "copy": true,
00:25:48.356 "nvme_iov_md": false
00:25:48.356 },
00:25:48.356 "memory_domains": [
00:25:48.356 {
00:25:48.356 "dma_device_id": "system",
00:25:48.356 "dma_device_type": 1
00:25:48.356 },
00:25:48.356 {
00:25:48.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:25:48.356 "dma_device_type": 2
00:25:48.356 }
00:25:48.356 ],
00:25:48.356 "driver_specific": {
00:25:48.356 "passthru": {
00:25:48.356 "name": "pt2",
00:25:48.356 "base_bdev_name": "malloc2"
00:25:48.356 }
00:25:48.356 }
00:25:48.356 }'
00:25:48.356 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:25:48.616 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:25:48.616 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]]
00:25:48.616 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:25:48.616 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:25:48.616 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:25:48.616 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:25:48.616 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:25:48.616 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:25:48.616 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:25:48.876 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:25:48.876 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:25:48.876 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:25:48.876 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid'
00:25:48.876 [2024-07-25 13:34:29.623163] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:25:48.876 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=0836fcf2-4683-4256-aa5c-6321c4b74d51
00:25:48.876 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@451 -- # '[' -z 0836fcf2-4683-4256-aa5c-6321c4b74d51 ']'
00:25:48.876 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:25:49.136 [2024-07-25 13:34:29.815451] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:25:49.136 [2024-07-25 13:34:29.815464] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:25:49.136 [2024-07-25 13:34:29.815504] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:25:49.136 [2024-07-25 13:34:29.815544] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:25:49.136 [2024-07-25 13:34:29.815556] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28c56b0 name raid_bdev1, state offline
00:25:49.136 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:49.136 13:34:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # jq -r '.[]'
00:25:49.396 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@457 -- # raid_bdev=
00:25:49.396 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']'
00:25:49.396 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}"
00:25:49.396 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1
00:25:49.656 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}"
00:25:49.656 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:25:49.656 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs
00:25:49.656 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any'
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@466 -- # '[' false == true ']'
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@650 -- # local es=0
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:25:49.915 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:25:49.916 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]]
00:25:49.916 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1
00:25:50.176 [2024-07-25 13:34:30.785878] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed
00:25:50.176 [2024-07-25 13:34:30.786946] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed
00:25:50.176 [2024-07-25 13:34:30.786989] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1
00:25:50.176 [2024-07-25 13:34:30.787016] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2
00:25:50.176 [2024-07-25 13:34:30.787027] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:25:50.176 [2024-07-25 13:34:30.787032] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2721e50 name raid_bdev1, state configuring
00:25:50.176 request:
00:25:50.176 {
00:25:50.176 "name": "raid_bdev1",
00:25:50.176 "raid_level": "raid1",
00:25:50.176 "base_bdevs": [
00:25:50.176 "malloc1",
00:25:50.176 "malloc2"
00:25:50.176 ],
00:25:50.176 "superblock": false,
00:25:50.176 "method": "bdev_raid_create",
00:25:50.176 "req_id": 1
00:25:50.176 }
00:25:50.176 Got JSON-RPC error response
00:25:50.176 response:
00:25:50.177 {
00:25:50.177 "code": -17,
00:25:50.177 "message": "Failed to create RAID bdev raid_bdev1: File exists"
00:25:50.177 }
00:25:50.177 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@653 -- # es=1
00:25:50.177 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:25:50.177 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@672 -- # [[ -n '' ]]
00:25:50.177 13:34:30 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:25:50.177 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:50.177 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # jq -r '.[]'
00:25:50.437 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@474 -- # raid_bdev=
00:25:50.437 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']'
00:25:50.437 13:34:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:25:50.437 [2024-07-25 13:34:31.170812] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:25:50.437 [2024-07-25 13:34:31.170835] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:25:50.437 [2024-07-25 13:34:31.170846] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2721be0
00:25:50.437 [2024-07-25 13:34:31.170853] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:25:50.437 [2024-07-25 13:34:31.172112] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:25:50.437 [2024-07-25 13:34:31.172131] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:25:50.437 [2024-07-25 13:34:31.172174] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1
00:25:50.437 [2024-07-25 13:34:31.172193] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:25:50.437 pt1
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:50.437 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:50.697 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:25:50.697 "name": "raid_bdev1",
00:25:50.697 "uuid": "0836fcf2-4683-4256-aa5c-6321c4b74d51",
00:25:50.697 "strip_size_kb": 0,
00:25:50.697 "state": "configuring",
00:25:50.697 "raid_level": "raid1",
00:25:50.697 "superblock": true,
00:25:50.697 "num_base_bdevs": 2,
00:25:50.697 "num_base_bdevs_discovered": 1,
00:25:50.697 "num_base_bdevs_operational": 2,
00:25:50.697 "base_bdevs_list": [
00:25:50.697 {
00:25:50.697 "name": "pt1",
00:25:50.697 "uuid": "00000000-0000-0000-0000-000000000001",
00:25:50.697 "is_configured": true,
00:25:50.697 "data_offset": 256,
00:25:50.697 "data_size": 7936
00:25:50.697 },
00:25:50.697 {
00:25:50.697 "name": null,
00:25:50.697 "uuid": "00000000-0000-0000-0000-000000000002",
00:25:50.697 "is_configured": false,
00:25:50.697 "data_offset": 256,
00:25:50.697 "data_size": 7936
00:25:50.697 }
00:25:50.697 ]
00:25:50.697 }'
00:25:50.697 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:25:50.697 13:34:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x
00:25:51.266 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']'
00:25:51.266 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i = 1 ))
00:25:51.267 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs ))
00:25:51.267 13:34:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:25:51.527 [2024-07-25 13:34:32.097167] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:25:51.527 [2024-07-25 13:34:32.097195] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:25:51.527 [2024-07-25 13:34:32.097204] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28c5430
00:25:51.527 [2024-07-25 13:34:32.097210] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:25:51.527 [2024-07-25 13:34:32.097480] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:25:51.527 [2024-07-25 13:34:32.097492] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:25:51.527 [2024-07-25 13:34:32.097534] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:25:51.527 [2024-07-25 13:34:32.097554] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:25:51.527 [2024-07-25 13:34:32.097630] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x28ba100
00:25:51.527 [2024-07-25 13:34:32.097637] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096
00:25:51.527 [2024-07-25 13:34:32.097772] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2719b30
00:25:51.527 [2024-07-25 13:34:32.097870] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28ba100
00:25:51.527 [2024-07-25 13:34:32.097876] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28ba100
00:25:51.527 [2024-07-25 13:34:32.097946] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:25:51.527 pt2
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i++ ))
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs ))
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:25:51.527 "name": "raid_bdev1",
00:25:51.527 "uuid": "0836fcf2-4683-4256-aa5c-6321c4b74d51",
00:25:51.527 "strip_size_kb": 0,
00:25:51.527 "state": "online",
00:25:51.527 "raid_level": "raid1",
00:25:51.527 "superblock": true,
00:25:51.527 "num_base_bdevs": 2,
00:25:51.527 "num_base_bdevs_discovered": 2,
00:25:51.527 "num_base_bdevs_operational": 2,
00:25:51.527 "base_bdevs_list": [
00:25:51.527 {
00:25:51.527 "name": "pt1",
00:25:51.527 "uuid": "00000000-0000-0000-0000-000000000001",
00:25:51.527 "is_configured": true,
00:25:51.527 "data_offset": 256,
00:25:51.527 "data_size": 7936
00:25:51.527 },
00:25:51.527 {
00:25:51.527 "name": "pt2",
00:25:51.527 "uuid": "00000000-0000-0000-0000-000000000002",
00:25:51.527 "is_configured": true,
00:25:51.527 "data_offset": 256,
00:25:51.527 "data_size": 7936
00:25:51.527 }
00:25:51.527 ]
00:25:51.527 }'
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:25:51.527 13:34:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x
00:25:52.096 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1
00:25:52.096 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1
00:25:52.096 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:25:52.096 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:25:52.096 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:25:52.096 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name
00:25:52.096 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:25:52.096 13:34:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:25:52.356 [2024-07-25 13:34:33.043767] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:25:52.356 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:25:52.356 "name": "raid_bdev1",
00:25:52.356 "aliases": [
00:25:52.356 "0836fcf2-4683-4256-aa5c-6321c4b74d51"
00:25:52.356 ],
00:25:52.356 "product_name": "Raid Volume",
00:25:52.356 "block_size": 4096,
00:25:52.356 "num_blocks": 7936,
00:25:52.356 "uuid": "0836fcf2-4683-4256-aa5c-6321c4b74d51",
00:25:52.356 "assigned_rate_limits": {
00:25:52.356 "rw_ios_per_sec": 0,
00:25:52.356 "rw_mbytes_per_sec": 0,
00:25:52.356 "r_mbytes_per_sec": 0,
00:25:52.356 "w_mbytes_per_sec": 0
00:25:52.356 },
00:25:52.356 "claimed": false,
00:25:52.357 "zoned": false,
00:25:52.357 "supported_io_types": {
00:25:52.357 "read": true,
00:25:52.357 "write": true,
00:25:52.357 "unmap": false,
00:25:52.357 "flush": false,
00:25:52.357 "reset": true,
00:25:52.357 "nvme_admin": false,
00:25:52.357 "nvme_io": false,
00:25:52.357 "nvme_io_md": false,
00:25:52.357 "write_zeroes": true,
00:25:52.357 "zcopy": false,
00:25:52.357 "get_zone_info": false,
00:25:52.357 "zone_management": false,
00:25:52.357 "zone_append": false,
00:25:52.357 "compare": false,
00:25:52.357 "compare_and_write": false,
00:25:52.357 "abort": false,
00:25:52.357 "seek_hole": false,
00:25:52.357 "seek_data": false,
00:25:52.357 "copy": false,
00:25:52.357 "nvme_iov_md": false
00:25:52.357 },
00:25:52.357 "memory_domains": [
00:25:52.357 {
00:25:52.357 "dma_device_id": "system",
00:25:52.357 "dma_device_type": 1
00:25:52.357 },
00:25:52.357 {
00:25:52.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:25:52.357 "dma_device_type": 2
00:25:52.357 },
00:25:52.357 {
00:25:52.357 "dma_device_id": "system",
00:25:52.357 "dma_device_type": 1
00:25:52.357 },
00:25:52.357 {
00:25:52.357 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:25:52.357 "dma_device_type": 2
00:25:52.357 }
00:25:52.357 ],
00:25:52.357 "driver_specific": {
00:25:52.357 "raid": {
00:25:52.357 "uuid": "0836fcf2-4683-4256-aa5c-6321c4b74d51",
00:25:52.357 "strip_size_kb": 0,
00:25:52.357 "state": "online",
00:25:52.357 "raid_level": "raid1",
00:25:52.357 "superblock": true,
00:25:52.357 "num_base_bdevs": 2,
00:25:52.357 "num_base_bdevs_discovered": 2,
00:25:52.357 "num_base_bdevs_operational": 2,
00:25:52.357 "base_bdevs_list": [
00:25:52.357 {
00:25:52.357 "name": "pt1",
00:25:52.357 "uuid": "00000000-0000-0000-0000-000000000001",
00:25:52.357 "is_configured": true,
00:25:52.357 "data_offset": 256,
00:25:52.357 "data_size": 7936
00:25:52.357 },
00:25:52.357 {
00:25:52.357 "name": "pt2",
00:25:52.357 "uuid": "00000000-0000-0000-0000-000000000002",
00:25:52.357 "is_configured": true,
00:25:52.357 "data_offset": 256,
00:25:52.357 "data_size": 7936
00:25:52.357 }
00:25:52.357 ]
00:25:52.357 }
00:25:52.357 }
00:25:52.357 }'
00:25:52.357 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:25:52.357 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1
00:25:52.357 pt2'
00:25:52.357 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:25:52.357 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1
00:25:52.357 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:25:52.617 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:25:52.617 "name": "pt1",
00:25:52.617 "aliases": [
00:25:52.617 "00000000-0000-0000-0000-000000000001"
00:25:52.617 ],
00:25:52.617 "product_name": "passthru",
00:25:52.617 "block_size": 4096,
00:25:52.617 "num_blocks": 8192,
00:25:52.617 "uuid": "00000000-0000-0000-0000-000000000001",
00:25:52.617 "assigned_rate_limits": {
00:25:52.617 "rw_ios_per_sec": 0,
00:25:52.617 "rw_mbytes_per_sec": 0,
00:25:52.617 "r_mbytes_per_sec": 0,
00:25:52.617 "w_mbytes_per_sec": 0
00:25:52.617 },
00:25:52.617 "claimed": true,
00:25:52.617 "claim_type": "exclusive_write",
00:25:52.617 "zoned": false,
00:25:52.617 "supported_io_types": {
00:25:52.617 "read": true,
00:25:52.617 "write": true,
00:25:52.617 "unmap": true,
00:25:52.617 "flush": true,
00:25:52.617 "reset": true,
00:25:52.617 "nvme_admin": false,
00:25:52.617 "nvme_io": false,
00:25:52.617 "nvme_io_md": false,
00:25:52.617 "write_zeroes": true,
00:25:52.617 "zcopy": true,
00:25:52.617 "get_zone_info": false,
00:25:52.617 "zone_management": false,
00:25:52.617 "zone_append": false,
00:25:52.617 "compare": false,
00:25:52.617 "compare_and_write": false,
00:25:52.617 "abort": true,
00:25:52.617 "seek_hole": false,
00:25:52.617 "seek_data": false,
00:25:52.617 "copy": true,
00:25:52.617 "nvme_iov_md": false
00:25:52.617 },
00:25:52.617 "memory_domains": [
00:25:52.617 {
00:25:52.617 "dma_device_id": "system",
00:25:52.617 "dma_device_type": 1
00:25:52.617 },
00:25:52.617 {
00:25:52.617 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:25:52.617 "dma_device_type": 2
00:25:52.617 }
00:25:52.617 ],
00:25:52.617 "driver_specific": {
00:25:52.617 "passthru": {
00:25:52.617 "name": "pt1",
00:25:52.617 "base_bdev_name": "malloc1"
00:25:52.617 }
00:25:52.617 }
00:25:52.617 }'
00:25:52.617 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:25:52.617 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:25:52.617 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]]
00:25:52.617 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:25:52.879 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:25:52.879 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:25:52.879 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:25:52.879 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:25:52.879 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:25:52.879 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:25:52.879 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:25:53.139 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:25:53.139 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:25:53.139 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2
00:25:53.139 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:25:53.139 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:25:53.139 "name": "pt2",
00:25:53.139 "aliases": [
00:25:53.139 "00000000-0000-0000-0000-000000000002"
00:25:53.139 ],
00:25:53.139 "product_name": "passthru",
00:25:53.139 "block_size": 4096,
00:25:53.139 "num_blocks": 8192,
00:25:53.139 "uuid": "00000000-0000-0000-0000-000000000002",
00:25:53.139 "assigned_rate_limits": {
00:25:53.139 "rw_ios_per_sec": 0,
00:25:53.139 "rw_mbytes_per_sec": 0,
00:25:53.139 "r_mbytes_per_sec": 0,
00:25:53.139 "w_mbytes_per_sec": 0
00:25:53.139 },
00:25:53.139 "claimed": true,
00:25:53.139 "claim_type": "exclusive_write",
00:25:53.139 "zoned": false,
00:25:53.139 "supported_io_types": {
00:25:53.140 "read": true,
00:25:53.140 "write": true,
00:25:53.140 "unmap": true,
00:25:53.140 "flush": true,
00:25:53.140 "reset": true,
00:25:53.140 "nvme_admin": false,
00:25:53.140 "nvme_io": false,
00:25:53.140 "nvme_io_md": false,
00:25:53.140 "write_zeroes": true,
00:25:53.140 "zcopy": true,
00:25:53.140 "get_zone_info": false,
00:25:53.140 "zone_management": false,
00:25:53.140 "zone_append": false,
00:25:53.140 "compare": false,
00:25:53.140 "compare_and_write": false,
00:25:53.140 "abort": true,
00:25:53.140 "seek_hole": false,
00:25:53.140 "seek_data": false,
00:25:53.140 "copy": true,
00:25:53.140 "nvme_iov_md": false
00:25:53.140 },
00:25:53.140 "memory_domains": [
00:25:53.140 {
00:25:53.140 "dma_device_id": "system",
00:25:53.140 "dma_device_type": 1
00:25:53.140 },
00:25:53.140 {
00:25:53.140 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:25:53.140 "dma_device_type": 2
00:25:53.140 }
00:25:53.140 ],
00:25:53.140 "driver_specific": {
00:25:53.140 "passthru": {
00:25:53.140 "name": "pt2",
00:25:53.140 "base_bdev_name": "malloc2"
00:25:53.140 }
00:25:53.140 }
00:25:53.140 }'
00:25:53.140 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:25:53.401 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:25:53.401 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]]
00:25:53.401 13:34:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:25:53.401 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:25:53.401 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:25:53.401 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:53.401 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:53.401 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:53.401 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:53.401 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:53.661 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:53.661 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:53.661 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:25:53.661 [2024-07-25 13:34:34.391171] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:53.661 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@502 -- # '[' 0836fcf2-4683-4256-aa5c-6321c4b74d51 '!=' 0836fcf2-4683-4256-aa5c-6321c4b74d51 ']' 00:25:53.661 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:25:53.661 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:53.661 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:25:53.661 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:53.922 [2024-07-25 13:34:34.579466] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:53.922 13:34:34 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.922 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.181 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:54.181 "name": "raid_bdev1", 00:25:54.181 "uuid": "0836fcf2-4683-4256-aa5c-6321c4b74d51", 00:25:54.181 "strip_size_kb": 0, 00:25:54.181 "state": "online", 00:25:54.181 "raid_level": "raid1", 00:25:54.181 "superblock": true, 00:25:54.181 "num_base_bdevs": 2, 00:25:54.181 "num_base_bdevs_discovered": 1, 00:25:54.181 "num_base_bdevs_operational": 1, 00:25:54.181 "base_bdevs_list": [ 00:25:54.181 { 00:25:54.182 "name": null, 00:25:54.182 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:54.182 "is_configured": false, 00:25:54.182 "data_offset": 256, 00:25:54.182 "data_size": 7936 
00:25:54.182 }, 00:25:54.182 { 00:25:54.182 "name": "pt2", 00:25:54.182 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:54.182 "is_configured": true, 00:25:54.182 "data_offset": 256, 00:25:54.182 "data_size": 7936 00:25:54.182 } 00:25:54.182 ] 00:25:54.182 }' 00:25:54.182 13:34:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:54.182 13:34:34 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:54.751 13:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:54.751 [2024-07-25 13:34:35.505806] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:54.751 [2024-07-25 13:34:35.505821] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:54.751 [2024-07-25 13:34:35.505855] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:54.751 [2024-07-25 13:34:35.505883] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:54.751 [2024-07-25 13:34:35.505889] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28ba100 name raid_bdev1, state offline 00:25:54.751 13:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.751 13:34:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:25:55.320 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:25:55.321 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:25:55.321 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:25:55.321 13:34:36 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:25:55.321 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:55.581 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:25:55.581 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:25:55.581 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:25:55.581 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:25:55.581 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@534 -- # i=1 00:25:55.581 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:55.841 [2024-07-25 13:34:36.452155] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:55.841 [2024-07-25 13:34:36.452186] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:55.841 [2024-07-25 13:34:36.452195] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28b9a10 00:25:55.841 [2024-07-25 13:34:36.452202] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:55.842 [2024-07-25 13:34:36.453471] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:55.842 [2024-07-25 13:34:36.453491] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:55.842 [2024-07-25 13:34:36.453539] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:55.842 [2024-07-25 13:34:36.453563] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:55.842 
[2024-07-25 13:34:36.453625] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2719300 00:25:55.842 [2024-07-25 13:34:36.453631] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:55.842 [2024-07-25 13:34:36.453771] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x271b650 00:25:55.842 [2024-07-25 13:34:36.453866] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2719300 00:25:55.842 [2024-07-25 13:34:36.453871] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2719300 00:25:55.842 [2024-07-25 13:34:36.453942] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:55.842 pt2 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.842 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:56.102 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:56.102 "name": "raid_bdev1", 00:25:56.102 "uuid": "0836fcf2-4683-4256-aa5c-6321c4b74d51", 00:25:56.102 "strip_size_kb": 0, 00:25:56.102 "state": "online", 00:25:56.102 "raid_level": "raid1", 00:25:56.102 "superblock": true, 00:25:56.102 "num_base_bdevs": 2, 00:25:56.102 "num_base_bdevs_discovered": 1, 00:25:56.102 "num_base_bdevs_operational": 1, 00:25:56.102 "base_bdevs_list": [ 00:25:56.102 { 00:25:56.102 "name": null, 00:25:56.102 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:56.102 "is_configured": false, 00:25:56.102 "data_offset": 256, 00:25:56.102 "data_size": 7936 00:25:56.102 }, 00:25:56.102 { 00:25:56.102 "name": "pt2", 00:25:56.102 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:56.102 "is_configured": true, 00:25:56.102 "data_offset": 256, 00:25:56.102 "data_size": 7936 00:25:56.102 } 00:25:56.102 ] 00:25:56.102 }' 00:25:56.102 13:34:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:56.102 13:34:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:56.672 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:56.672 [2024-07-25 13:34:37.402541] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:56.672 [2024-07-25 13:34:37.402560] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:56.672 [2024-07-25 13:34:37.402592] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:56.672 [2024-07-25 
13:34:37.402623] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:56.672 [2024-07-25 13:34:37.402629] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2719300 name raid_bdev1, state offline 00:25:56.672 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:56.672 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:25:56.932 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:25:56.932 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:25:56.932 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:25:56.932 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:57.192 [2024-07-25 13:34:37.795532] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:57.192 [2024-07-25 13:34:37.795567] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:57.192 [2024-07-25 13:34:37.795577] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2721e50 00:25:57.192 [2024-07-25 13:34:37.795583] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:57.192 [2024-07-25 13:34:37.796840] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:57.192 [2024-07-25 13:34:37.796859] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:57.192 [2024-07-25 13:34:37.796904] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:57.192 
[2024-07-25 13:34:37.796922] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:57.192 [2024-07-25 13:34:37.796996] bdev_raid.c:3665:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:25:57.192 [2024-07-25 13:34:37.797003] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:57.192 [2024-07-25 13:34:37.797011] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x271bcf0 name raid_bdev1, state configuring 00:25:57.192 [2024-07-25 13:34:37.797025] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:57.192 [2024-07-25 13:34:37.797063] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x271c8a0 00:25:57.192 [2024-07-25 13:34:37.797069] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:57.192 [2024-07-25 13:34:37.797201] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28b9d30 00:25:57.192 [2024-07-25 13:34:37.797296] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x271c8a0 00:25:57.192 [2024-07-25 13:34:37.797301] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x271c8a0 00:25:57.192 [2024-07-25 13:34:37.797372] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:57.192 pt1 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid1 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.192 13:34:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.453 13:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:57.453 "name": "raid_bdev1", 00:25:57.453 "uuid": "0836fcf2-4683-4256-aa5c-6321c4b74d51", 00:25:57.453 "strip_size_kb": 0, 00:25:57.453 "state": "online", 00:25:57.453 "raid_level": "raid1", 00:25:57.453 "superblock": true, 00:25:57.453 "num_base_bdevs": 2, 00:25:57.453 "num_base_bdevs_discovered": 1, 00:25:57.453 "num_base_bdevs_operational": 1, 00:25:57.453 "base_bdevs_list": [ 00:25:57.453 { 00:25:57.453 "name": null, 00:25:57.453 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:57.453 "is_configured": false, 00:25:57.453 "data_offset": 256, 00:25:57.453 "data_size": 7936 00:25:57.453 }, 00:25:57.453 { 00:25:57.453 "name": "pt2", 00:25:57.453 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:57.453 "is_configured": true, 00:25:57.453 "data_offset": 256, 00:25:57.453 "data_size": 7936 00:25:57.453 } 00:25:57.453 ] 00:25:57.453 }' 00:25:57.453 13:34:38 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:57.453 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:58.023 13:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:25:58.023 13:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:25:58.023 13:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:25:58.023 13:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:58.023 13:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:25:58.282 [2024-07-25 13:34:38.930575] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@573 -- # '[' 0836fcf2-4683-4256-aa5c-6321c4b74d51 '!=' 0836fcf2-4683-4256-aa5c-6321c4b74d51 ']' 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@578 -- # killprocess 1034747 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@950 -- # '[' -z 1034747 ']' 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # kill -0 1034747 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # uname 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1034747 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@956 -- # 
process_name=reactor_0 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1034747' 00:25:58.282 killing process with pid 1034747 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@969 -- # kill 1034747 00:25:58.282 [2024-07-25 13:34:39.001006] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:58.282 [2024-07-25 13:34:39.001044] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:58.282 [2024-07-25 13:34:39.001073] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:58.282 [2024-07-25 13:34:39.001078] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x271c8a0 name raid_bdev1, state offline 00:25:58.282 13:34:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@974 -- # wait 1034747 00:25:58.282 [2024-07-25 13:34:39.010281] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:58.556 13:34:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@580 -- # return 0 00:25:58.556 00:25:58.556 real 0m13.608s 00:25:58.556 user 0m25.170s 00:25:58.556 sys 0m2.080s 00:25:58.556 13:34:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:25:58.556 13:34:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:58.556 ************************************ 00:25:58.556 END TEST raid_superblock_test_4k 00:25:58.556 ************************************ 00:25:58.556 13:34:39 bdev_raid -- bdev/bdev_raid.sh@980 -- # '[' true = true ']' 00:25:58.556 13:34:39 bdev_raid -- bdev/bdev_raid.sh@981 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:25:58.556 13:34:39 bdev_raid -- common/autotest_common.sh@1101 -- # 
'[' 7 -le 1 ']' 00:25:58.556 13:34:39 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:25:58.556 13:34:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:58.556 ************************************ 00:25:58.556 START TEST raid_rebuild_test_sb_4k 00:25:58.556 ************************************ 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # local verify=true 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:58.556 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:58.557 13:34:39 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # local strip_size 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # local create_arg 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@594 -- # local data_offset 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # raid_pid=1037229 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@613 -- # waitforlisten 1037229 /var/tmp/spdk-raid.sock 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@831 -- # '[' -z 1037229 ']' 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@836 -- # local max_retries=100 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@838 -- # echo 'Waiting for 
process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:58.557 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@840 -- # xtrace_disable 00:25:58.557 13:34:39 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:58.557 [2024-07-25 13:34:39.292413] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:25:58.557 [2024-07-25 13:34:39.292472] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1037229 ] 00:25:58.557 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:58.557 Zero copy mechanism will not be used. 00:25:58.875 [2024-07-25 13:34:39.380940] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:58.875 [2024-07-25 13:34:39.453679] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:25:58.875 [2024-07-25 13:34:39.492614] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:58.875 [2024-07-25 13:34:39.492638] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:59.447 13:34:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:25:59.447 13:34:40 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@864 -- # return 0 00:25:59.447 13:34:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:59.447 13:34:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:25:59.707 BaseBdev1_malloc 00:25:59.707 13:34:40 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:59.707 [2024-07-25 13:34:40.435147] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:59.707 [2024-07-25 13:34:40.435182] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:59.707 [2024-07-25 13:34:40.435197] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7e8d10 00:25:59.708 [2024-07-25 13:34:40.435204] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:59.708 [2024-07-25 13:34:40.436464] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:59.708 [2024-07-25 13:34:40.436485] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:59.708 BaseBdev1 00:25:59.708 13:34:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:25:59.708 13:34:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:25:59.967 BaseBdev2_malloc 00:25:59.967 13:34:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:00.228 [2024-07-25 13:34:40.802081] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:00.228 [2024-07-25 13:34:40.802110] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:00.228 [2024-07-25 13:34:40.802123] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7e96d0 00:26:00.228 [2024-07-25 13:34:40.802130] vbdev_passthru.c: 
696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:00.228 [2024-07-25 13:34:40.803293] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:00.228 [2024-07-25 13:34:40.803312] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:00.228 BaseBdev2 00:26:00.228 13:34:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:26:00.228 spare_malloc 00:26:00.228 13:34:40 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:00.488 spare_delay 00:26:00.488 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:00.748 [2024-07-25 13:34:41.365274] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:00.748 [2024-07-25 13:34:41.365303] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:00.748 [2024-07-25 13:34:41.365315] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7e0ac0 00:26:00.748 [2024-07-25 13:34:41.365321] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:00.748 [2024-07-25 13:34:41.366521] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:00.748 [2024-07-25 13:34:41.366540] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:00.748 spare 00:26:00.748 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:01.008 [2024-07-25 13:34:41.553765] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:01.008 [2024-07-25 13:34:41.554768] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:01.008 [2024-07-25 13:34:41.554874] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x7e1c70 00:26:01.008 [2024-07-25 13:34:41.554881] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:01.008 [2024-07-25 13:34:41.555029] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7e9960 00:26:01.008 [2024-07-25 13:34:41.555136] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7e1c70 00:26:01.008 [2024-07-25 13:34:41.555142] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x7e1c70 00:26:01.008 [2024-07-25 13:34:41.555220] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.008 "name": "raid_bdev1", 00:26:01.008 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:01.008 "strip_size_kb": 0, 00:26:01.008 "state": "online", 00:26:01.008 "raid_level": "raid1", 00:26:01.008 "superblock": true, 00:26:01.008 "num_base_bdevs": 2, 00:26:01.008 "num_base_bdevs_discovered": 2, 00:26:01.008 "num_base_bdevs_operational": 2, 00:26:01.008 "base_bdevs_list": [ 00:26:01.008 { 00:26:01.008 "name": "BaseBdev1", 00:26:01.008 "uuid": "bd2c3614-0e92-5586-a32f-8d1e99283d83", 00:26:01.008 "is_configured": true, 00:26:01.008 "data_offset": 256, 00:26:01.008 "data_size": 7936 00:26:01.008 }, 00:26:01.008 { 00:26:01.008 "name": "BaseBdev2", 00:26:01.008 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:01.008 "is_configured": true, 00:26:01.008 "data_offset": 256, 00:26:01.008 "data_size": 7936 00:26:01.008 } 00:26:01.008 ] 00:26:01.008 }' 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:01.008 13:34:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:01.948 13:34:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:01.948 13:34:42 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:26:02.518 [2024-07-25 13:34:43.246218] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:02.518 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:26:02.518 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:02.518 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 
-- # (( i < 1 )) 00:26:02.783 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:03.046 [2024-07-25 13:34:43.651082] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7e9960 00:26:03.046 /dev/nbd0 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:03.046 1+0 records in 00:26:03.046 1+0 records out 00:26:03.046 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280783 s, 14.6 MB/s 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.046 13:34:43 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:26:03.046 13:34:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:26:03.616 7936+0 records in 00:26:03.616 7936+0 records out 00:26:03.616 32505856 bytes (33 MB, 31 MiB) copied, 0.618406 s, 52.6 MB/s 00:26:03.616 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:03.616 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:03.616 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:03.616 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:03.616 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:26:03.616 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:03.616 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:03.877 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:03.877 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:03.877 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:03.877 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:03.877 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:03.877 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:03.877 [2024-07-25 13:34:44.533877] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:03.877 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:03.877 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:03.877 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:04.137 [2024-07-25 13:34:44.711954] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.137 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:04.137 "name": "raid_bdev1", 00:26:04.137 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:04.137 "strip_size_kb": 0, 00:26:04.137 "state": "online", 00:26:04.137 "raid_level": "raid1", 00:26:04.137 "superblock": true, 00:26:04.137 "num_base_bdevs": 2, 00:26:04.137 "num_base_bdevs_discovered": 1, 00:26:04.137 "num_base_bdevs_operational": 1, 00:26:04.137 "base_bdevs_list": [ 00:26:04.137 { 00:26:04.137 "name": null, 00:26:04.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.137 "is_configured": false, 00:26:04.137 "data_offset": 256, 00:26:04.137 "data_size": 7936 00:26:04.137 }, 00:26:04.137 { 00:26:04.137 "name": "BaseBdev2", 00:26:04.137 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:04.138 "is_configured": true, 00:26:04.138 "data_offset": 256, 00:26:04.138 "data_size": 7936 00:26:04.138 } 00:26:04.138 ] 00:26:04.138 }' 00:26:04.138 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:04.138 13:34:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:04.709 13:34:45 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:04.969 [2024-07-25 13:34:45.654346] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:04.969 [2024-07-25 13:34:45.657814] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7e1500 00:26:04.969 [2024-07-25 13:34:45.659429] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:04.969 13:34:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:05.908 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:05.908 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:05.908 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:05.908 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:05.908 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:05.908 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:05.908 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.169 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:06.169 "name": "raid_bdev1", 00:26:06.169 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:06.169 "strip_size_kb": 0, 00:26:06.169 "state": "online", 00:26:06.169 "raid_level": "raid1", 00:26:06.169 "superblock": true, 00:26:06.169 "num_base_bdevs": 2, 00:26:06.169 "num_base_bdevs_discovered": 2, 
00:26:06.169 "num_base_bdevs_operational": 2, 00:26:06.169 "process": { 00:26:06.169 "type": "rebuild", 00:26:06.169 "target": "spare", 00:26:06.169 "progress": { 00:26:06.169 "blocks": 2816, 00:26:06.169 "percent": 35 00:26:06.169 } 00:26:06.169 }, 00:26:06.169 "base_bdevs_list": [ 00:26:06.169 { 00:26:06.169 "name": "spare", 00:26:06.169 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:06.169 "is_configured": true, 00:26:06.169 "data_offset": 256, 00:26:06.169 "data_size": 7936 00:26:06.169 }, 00:26:06.169 { 00:26:06.169 "name": "BaseBdev2", 00:26:06.169 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:06.169 "is_configured": true, 00:26:06.169 "data_offset": 256, 00:26:06.169 "data_size": 7936 00:26:06.169 } 00:26:06.169 ] 00:26:06.169 }' 00:26:06.169 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:06.169 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:06.169 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:06.169 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:06.169 13:34:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:06.429 [2024-07-25 13:34:47.120194] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:06.429 [2024-07-25 13:34:47.168311] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:06.429 [2024-07-25 13:34:47.168347] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:06.429 [2024-07-25 13:34:47.168357] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:06.429 [2024-07-25 13:34:47.168361] 
bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:06.429 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.430 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.690 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:06.690 "name": "raid_bdev1", 00:26:06.690 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:06.690 "strip_size_kb": 0, 00:26:06.690 "state": "online", 00:26:06.690 "raid_level": "raid1", 00:26:06.690 "superblock": true, 00:26:06.690 "num_base_bdevs": 2, 00:26:06.690 "num_base_bdevs_discovered": 1, 00:26:06.690 
"num_base_bdevs_operational": 1, 00:26:06.690 "base_bdevs_list": [ 00:26:06.690 { 00:26:06.690 "name": null, 00:26:06.690 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:06.690 "is_configured": false, 00:26:06.690 "data_offset": 256, 00:26:06.690 "data_size": 7936 00:26:06.690 }, 00:26:06.690 { 00:26:06.690 "name": "BaseBdev2", 00:26:06.690 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:06.690 "is_configured": true, 00:26:06.690 "data_offset": 256, 00:26:06.690 "data_size": 7936 00:26:06.690 } 00:26:06.690 ] 00:26:06.690 }' 00:26:06.690 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:06.690 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:07.259 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:07.259 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:07.259 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:07.259 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:07.259 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:07.259 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:07.259 13:34:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.519 13:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.519 "name": "raid_bdev1", 00:26:07.519 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:07.519 "strip_size_kb": 0, 00:26:07.519 "state": "online", 00:26:07.519 "raid_level": "raid1", 00:26:07.519 "superblock": true, 00:26:07.519 
"num_base_bdevs": 2, 00:26:07.519 "num_base_bdevs_discovered": 1, 00:26:07.519 "num_base_bdevs_operational": 1, 00:26:07.519 "base_bdevs_list": [ 00:26:07.519 { 00:26:07.519 "name": null, 00:26:07.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.519 "is_configured": false, 00:26:07.519 "data_offset": 256, 00:26:07.519 "data_size": 7936 00:26:07.519 }, 00:26:07.519 { 00:26:07.519 "name": "BaseBdev2", 00:26:07.519 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:07.519 "is_configured": true, 00:26:07.519 "data_offset": 256, 00:26:07.519 "data_size": 7936 00:26:07.519 } 00:26:07.519 ] 00:26:07.519 }' 00:26:07.519 13:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.519 13:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:07.519 13:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.519 13:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:07.519 13:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:07.779 [2024-07-25 13:34:48.411545] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:07.779 [2024-07-25 13:34:48.414776] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7e0590 00:26:07.779 [2024-07-25 13:34:48.415910] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:07.779 13:34:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@678 -- # sleep 1 00:26:08.717 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:08.717 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:26:08.717 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:08.717 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:08.717 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:08.717 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.717 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:08.977 "name": "raid_bdev1", 00:26:08.977 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:08.977 "strip_size_kb": 0, 00:26:08.977 "state": "online", 00:26:08.977 "raid_level": "raid1", 00:26:08.977 "superblock": true, 00:26:08.977 "num_base_bdevs": 2, 00:26:08.977 "num_base_bdevs_discovered": 2, 00:26:08.977 "num_base_bdevs_operational": 2, 00:26:08.977 "process": { 00:26:08.977 "type": "rebuild", 00:26:08.977 "target": "spare", 00:26:08.977 "progress": { 00:26:08.977 "blocks": 2816, 00:26:08.977 "percent": 35 00:26:08.977 } 00:26:08.977 }, 00:26:08.977 "base_bdevs_list": [ 00:26:08.977 { 00:26:08.977 "name": "spare", 00:26:08.977 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:08.977 "is_configured": true, 00:26:08.977 "data_offset": 256, 00:26:08.977 "data_size": 7936 00:26:08.977 }, 00:26:08.977 { 00:26:08.977 "name": "BaseBdev2", 00:26:08.977 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:08.977 "is_configured": true, 00:26:08.977 "data_offset": 256, 00:26:08.977 "data_size": 7936 00:26:08.977 } 00:26:08.977 ] 00:26:08.977 }' 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:08.977 13:34:49 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:26:08.977 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # local timeout=946 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:08.977 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.977 
13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:09.237 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:09.237 "name": "raid_bdev1", 00:26:09.237 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:09.237 "strip_size_kb": 0, 00:26:09.237 "state": "online", 00:26:09.237 "raid_level": "raid1", 00:26:09.237 "superblock": true, 00:26:09.237 "num_base_bdevs": 2, 00:26:09.237 "num_base_bdevs_discovered": 2, 00:26:09.237 "num_base_bdevs_operational": 2, 00:26:09.237 "process": { 00:26:09.237 "type": "rebuild", 00:26:09.237 "target": "spare", 00:26:09.237 "progress": { 00:26:09.237 "blocks": 3584, 00:26:09.237 "percent": 45 00:26:09.237 } 00:26:09.237 }, 00:26:09.237 "base_bdevs_list": [ 00:26:09.237 { 00:26:09.237 "name": "spare", 00:26:09.237 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:09.237 "is_configured": true, 00:26:09.237 "data_offset": 256, 00:26:09.237 "data_size": 7936 00:26:09.237 }, 00:26:09.237 { 00:26:09.237 "name": "BaseBdev2", 00:26:09.237 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:09.237 "is_configured": true, 00:26:09.237 "data_offset": 256, 00:26:09.237 "data_size": 7936 00:26:09.237 } 00:26:09.237 ] 00:26:09.237 }' 00:26:09.237 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:09.237 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:09.237 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:09.237 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:09.237 13:34:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:10.619 13:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:10.619 13:34:50 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:10.619 13:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:10.619 13:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:10.619 13:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:10.619 13:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:10.619 13:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.619 13:34:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:10.619 13:34:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:10.619 "name": "raid_bdev1", 00:26:10.619 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:10.619 "strip_size_kb": 0, 00:26:10.619 "state": "online", 00:26:10.619 "raid_level": "raid1", 00:26:10.619 "superblock": true, 00:26:10.619 "num_base_bdevs": 2, 00:26:10.619 "num_base_bdevs_discovered": 2, 00:26:10.619 "num_base_bdevs_operational": 2, 00:26:10.619 "process": { 00:26:10.619 "type": "rebuild", 00:26:10.619 "target": "spare", 00:26:10.619 "progress": { 00:26:10.619 "blocks": 6912, 00:26:10.619 "percent": 87 00:26:10.619 } 00:26:10.619 }, 00:26:10.619 "base_bdevs_list": [ 00:26:10.619 { 00:26:10.619 "name": "spare", 00:26:10.619 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:10.619 "is_configured": true, 00:26:10.619 "data_offset": 256, 00:26:10.619 "data_size": 7936 00:26:10.619 }, 00:26:10.619 { 00:26:10.619 "name": "BaseBdev2", 00:26:10.619 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:10.619 "is_configured": true, 00:26:10.619 "data_offset": 256, 00:26:10.619 
"data_size": 7936 00:26:10.619 } 00:26:10.619 ] 00:26:10.620 }' 00:26:10.620 13:34:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:10.620 13:34:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:10.620 13:34:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:10.620 13:34:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:10.620 13:34:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@726 -- # sleep 1 00:26:10.879 [2024-07-25 13:34:51.533917] bdev_raid.c:2886:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:26:10.879 [2024-07-25 13:34:51.533961] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:10.879 [2024-07-25 13:34:51.534024] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:11.819 "name": "raid_bdev1", 00:26:11.819 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:11.819 "strip_size_kb": 0, 00:26:11.819 "state": "online", 00:26:11.819 "raid_level": "raid1", 00:26:11.819 "superblock": true, 00:26:11.819 "num_base_bdevs": 2, 00:26:11.819 "num_base_bdevs_discovered": 2, 00:26:11.819 "num_base_bdevs_operational": 2, 00:26:11.819 "base_bdevs_list": [ 00:26:11.819 { 00:26:11.819 "name": "spare", 00:26:11.819 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:11.819 "is_configured": true, 00:26:11.819 "data_offset": 256, 00:26:11.819 "data_size": 7936 00:26:11.819 }, 00:26:11.819 { 00:26:11.819 "name": "BaseBdev2", 00:26:11.819 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:11.819 "is_configured": true, 00:26:11.819 "data_offset": 256, 00:26:11.819 "data_size": 7936 00:26:11.819 } 00:26:11.819 ] 00:26:11.819 }' 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:11.819 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@724 -- # break 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 
-- # local target=none 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:12.079 "name": "raid_bdev1", 00:26:12.079 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:12.079 "strip_size_kb": 0, 00:26:12.079 "state": "online", 00:26:12.079 "raid_level": "raid1", 00:26:12.079 "superblock": true, 00:26:12.079 "num_base_bdevs": 2, 00:26:12.079 "num_base_bdevs_discovered": 2, 00:26:12.079 "num_base_bdevs_operational": 2, 00:26:12.079 "base_bdevs_list": [ 00:26:12.079 { 00:26:12.079 "name": "spare", 00:26:12.079 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:12.079 "is_configured": true, 00:26:12.079 "data_offset": 256, 00:26:12.079 "data_size": 7936 00:26:12.079 }, 00:26:12.079 { 00:26:12.079 "name": "BaseBdev2", 00:26:12.079 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:12.079 "is_configured": true, 00:26:12.079 "data_offset": 256, 00:26:12.079 "data_size": 7936 00:26:12.079 } 00:26:12.079 ] 00:26:12.079 }' 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:12.079 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 2 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:12.339 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.340 13:34:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:12.340 13:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:12.340 "name": "raid_bdev1", 00:26:12.340 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:12.340 "strip_size_kb": 0, 00:26:12.340 "state": "online", 00:26:12.340 "raid_level": "raid1", 00:26:12.340 "superblock": true, 00:26:12.340 "num_base_bdevs": 2, 00:26:12.340 "num_base_bdevs_discovered": 2, 00:26:12.340 "num_base_bdevs_operational": 2, 00:26:12.340 "base_bdevs_list": [ 00:26:12.340 { 00:26:12.340 "name": "spare", 00:26:12.340 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:12.340 "is_configured": true, 
00:26:12.340 "data_offset": 256, 00:26:12.340 "data_size": 7936 00:26:12.340 }, 00:26:12.340 { 00:26:12.340 "name": "BaseBdev2", 00:26:12.340 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:12.340 "is_configured": true, 00:26:12.340 "data_offset": 256, 00:26:12.340 "data_size": 7936 00:26:12.340 } 00:26:12.340 ] 00:26:12.340 }' 00:26:12.340 13:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:12.340 13:34:53 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:12.908 13:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:13.167 [2024-07-25 13:34:53.847701] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:13.167 [2024-07-25 13:34:53.847719] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:13.167 [2024-07-25 13:34:53.847760] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:13.167 [2024-07-25 13:34:53.847800] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:13.167 [2024-07-25 13:34:53.847806] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7e1c70 name raid_bdev1, state offline 00:26:13.167 13:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.167 13:34:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # jq length 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # '[' 
false = true ']' 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:13.427 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:13.687 /dev/nbd0 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:13.687 1+0 records in 00:26:13.687 1+0 records out 00:26:13.687 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000148727 s, 27.5 MB/s 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:13.687 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:13.947 /dev/nbd1 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd1 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@869 -- # local i 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@873 -- # break 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:13.947 1+0 records in 00:26:13.947 1+0 records out 00:26:13.947 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027799 s, 14.7 MB/s 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # size=4096 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@889 -- # return 0 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:13.947 
13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:13.947 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:14.207 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:14.207 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:14.207 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:14.207 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:14.207 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:14.207 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:14.207 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:14.207 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:14.207 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:14.207 13:34:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:14.778 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:15.037 [2024-07-25 13:34:55.716205] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:15.037 [2024-07-25 13:34:55.716238] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:15.037 [2024-07-25 13:34:55.716250] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7e0230 00:26:15.037 [2024-07-25 
13:34:55.716258] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:15.037 [2024-07-25 13:34:55.717602] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:15.037 [2024-07-25 13:34:55.717627] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:15.037 [2024-07-25 13:34:55.717688] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:15.037 [2024-07-25 13:34:55.717708] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:15.037 [2024-07-25 13:34:55.717788] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:15.037 spare 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.037 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:15.037 [2024-07-25 13:34:55.818078] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x7e7f80 00:26:15.037 [2024-07-25 13:34:55.818087] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:15.037 [2024-07-25 13:34:55.818247] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7e2020 00:26:15.037 [2024-07-25 13:34:55.818366] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x7e7f80 00:26:15.037 [2024-07-25 13:34:55.818372] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x7e7f80 00:26:15.037 [2024-07-25 13:34:55.818453] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:15.296 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:15.296 "name": "raid_bdev1", 00:26:15.296 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:15.296 "strip_size_kb": 0, 00:26:15.296 "state": "online", 00:26:15.296 "raid_level": "raid1", 00:26:15.296 "superblock": true, 00:26:15.296 "num_base_bdevs": 2, 00:26:15.296 "num_base_bdevs_discovered": 2, 00:26:15.296 "num_base_bdevs_operational": 2, 00:26:15.296 "base_bdevs_list": [ 00:26:15.296 { 00:26:15.296 "name": "spare", 00:26:15.296 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:15.296 "is_configured": true, 00:26:15.296 "data_offset": 256, 00:26:15.296 "data_size": 7936 00:26:15.296 }, 00:26:15.296 { 00:26:15.296 "name": "BaseBdev2", 00:26:15.297 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:15.297 "is_configured": true, 00:26:15.297 "data_offset": 256, 00:26:15.297 "data_size": 7936 00:26:15.297 } 00:26:15.297 ] 00:26:15.297 }' 00:26:15.297 13:34:55 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:15.297 13:34:55 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:15.865 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:15.865 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:15.865 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:15.865 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:15.866 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:15.866 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:15.866 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:16.126 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:16.126 "name": "raid_bdev1", 00:26:16.126 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:16.126 "strip_size_kb": 0, 00:26:16.126 "state": "online", 00:26:16.126 "raid_level": "raid1", 00:26:16.126 "superblock": true, 00:26:16.126 "num_base_bdevs": 2, 00:26:16.126 "num_base_bdevs_discovered": 2, 00:26:16.126 "num_base_bdevs_operational": 2, 00:26:16.126 "base_bdevs_list": [ 00:26:16.126 { 00:26:16.126 "name": "spare", 00:26:16.126 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:16.126 "is_configured": true, 00:26:16.126 "data_offset": 256, 00:26:16.126 "data_size": 7936 00:26:16.126 }, 00:26:16.126 { 00:26:16.126 "name": "BaseBdev2", 00:26:16.126 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:16.126 "is_configured": true, 00:26:16.126 "data_offset": 256, 00:26:16.126 "data_size": 
7936 00:26:16.126 } 00:26:16.126 ] 00:26:16.126 }' 00:26:16.126 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:16.126 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:16.126 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:16.126 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:16.126 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.126 13:34:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:16.385 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:26:16.385 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:16.645 [2024-07-25 13:34:57.212063] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:16.645 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:16.646 13:34:57 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:16.646 "name": "raid_bdev1", 00:26:16.646 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:16.646 "strip_size_kb": 0, 00:26:16.646 "state": "online", 00:26:16.646 "raid_level": "raid1", 00:26:16.646 "superblock": true, 00:26:16.646 "num_base_bdevs": 2, 00:26:16.646 "num_base_bdevs_discovered": 1, 00:26:16.646 "num_base_bdevs_operational": 1, 00:26:16.646 "base_bdevs_list": [ 00:26:16.646 { 00:26:16.646 "name": null, 00:26:16.646 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:16.646 "is_configured": false, 00:26:16.646 "data_offset": 256, 00:26:16.646 "data_size": 7936 00:26:16.646 }, 00:26:16.646 { 00:26:16.646 "name": "BaseBdev2", 00:26:16.646 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:16.646 "is_configured": true, 00:26:16.646 "data_offset": 256, 00:26:16.646 "data_size": 7936 00:26:16.646 } 00:26:16.646 ] 00:26:16.646 }' 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:16.646 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:17.215 13:34:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@770 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:17.475 [2024-07-25 13:34:58.130392] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:17.475 [2024-07-25 13:34:58.130497] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:17.476 [2024-07-25 13:34:58.130507] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:17.476 [2024-07-25 13:34:58.130525] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:17.476 [2024-07-25 13:34:58.133826] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7e4de0 00:26:17.476 [2024-07-25 13:34:58.134892] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:17.476 13:34:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # sleep 1 00:26:18.416 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:18.416 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:18.416 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:18.416 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:18.416 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:18.416 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.416 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:18.676 13:34:59 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:18.676 "name": "raid_bdev1", 00:26:18.676 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:18.676 "strip_size_kb": 0, 00:26:18.676 "state": "online", 00:26:18.676 "raid_level": "raid1", 00:26:18.676 "superblock": true, 00:26:18.676 "num_base_bdevs": 2, 00:26:18.676 "num_base_bdevs_discovered": 2, 00:26:18.676 "num_base_bdevs_operational": 2, 00:26:18.676 "process": { 00:26:18.676 "type": "rebuild", 00:26:18.676 "target": "spare", 00:26:18.676 "progress": { 00:26:18.676 "blocks": 2816, 00:26:18.676 "percent": 35 00:26:18.676 } 00:26:18.676 }, 00:26:18.676 "base_bdevs_list": [ 00:26:18.676 { 00:26:18.676 "name": "spare", 00:26:18.676 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:18.676 "is_configured": true, 00:26:18.676 "data_offset": 256, 00:26:18.676 "data_size": 7936 00:26:18.676 }, 00:26:18.676 { 00:26:18.676 "name": "BaseBdev2", 00:26:18.676 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:18.676 "is_configured": true, 00:26:18.676 "data_offset": 256, 00:26:18.676 "data_size": 7936 00:26:18.676 } 00:26:18.676 ] 00:26:18.676 }' 00:26:18.676 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:18.676 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:18.676 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:18.676 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:18.676 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:18.936 [2024-07-25 13:34:59.623180] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:18.936 [2024-07-25 13:34:59.643792] 
bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:18.936 [2024-07-25 13:34:59.643820] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:18.936 [2024-07-25 13:34:59.643830] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:18.936 [2024-07-25 13:34:59.643834] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:18.936 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:18.937 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:19.196 13:34:59 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:19.196 "name": "raid_bdev1", 00:26:19.196 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:19.196 "strip_size_kb": 0, 00:26:19.196 "state": "online", 00:26:19.196 "raid_level": "raid1", 00:26:19.196 "superblock": true, 00:26:19.196 "num_base_bdevs": 2, 00:26:19.196 "num_base_bdevs_discovered": 1, 00:26:19.196 "num_base_bdevs_operational": 1, 00:26:19.196 "base_bdevs_list": [ 00:26:19.196 { 00:26:19.196 "name": null, 00:26:19.196 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:19.196 "is_configured": false, 00:26:19.196 "data_offset": 256, 00:26:19.196 "data_size": 7936 00:26:19.196 }, 00:26:19.196 { 00:26:19.196 "name": "BaseBdev2", 00:26:19.196 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:19.196 "is_configured": true, 00:26:19.196 "data_offset": 256, 00:26:19.196 "data_size": 7936 00:26:19.196 } 00:26:19.196 ] 00:26:19.196 }' 00:26:19.196 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:19.196 13:34:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:19.766 13:35:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:20.025 [2024-07-25 13:35:00.650167] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:20.025 [2024-07-25 13:35:00.650207] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:20.025 [2024-07-25 13:35:00.650223] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x7e3960 00:26:20.025 [2024-07-25 13:35:00.650230] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:20.025 [2024-07-25 13:35:00.650540] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:20.025 [2024-07-25 
13:35:00.650557] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:20.025 [2024-07-25 13:35:00.650614] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:20.025 [2024-07-25 13:35:00.650622] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:20.025 [2024-07-25 13:35:00.650628] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:20.025 [2024-07-25 13:35:00.650646] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:20.025 [2024-07-25 13:35:00.653898] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7e9960 00:26:20.025 [2024-07-25 13:35:00.654966] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:20.025 spare 00:26:20.025 13:35:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # sleep 1 00:26:20.964 13:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:20.964 13:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:20.964 13:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:20.964 13:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:20.964 13:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:20.964 13:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.964 13:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.223 13:35:01 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:21.223 "name": "raid_bdev1", 00:26:21.223 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:21.223 "strip_size_kb": 0, 00:26:21.223 "state": "online", 00:26:21.223 "raid_level": "raid1", 00:26:21.223 "superblock": true, 00:26:21.223 "num_base_bdevs": 2, 00:26:21.223 "num_base_bdevs_discovered": 2, 00:26:21.224 "num_base_bdevs_operational": 2, 00:26:21.224 "process": { 00:26:21.224 "type": "rebuild", 00:26:21.224 "target": "spare", 00:26:21.224 "progress": { 00:26:21.224 "blocks": 2816, 00:26:21.224 "percent": 35 00:26:21.224 } 00:26:21.224 }, 00:26:21.224 "base_bdevs_list": [ 00:26:21.224 { 00:26:21.224 "name": "spare", 00:26:21.224 "uuid": "f2e3d3c5-5898-5ddb-acb3-fd47001e5470", 00:26:21.224 "is_configured": true, 00:26:21.224 "data_offset": 256, 00:26:21.224 "data_size": 7936 00:26:21.224 }, 00:26:21.224 { 00:26:21.224 "name": "BaseBdev2", 00:26:21.224 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:21.224 "is_configured": true, 00:26:21.224 "data_offset": 256, 00:26:21.224 "data_size": 7936 00:26:21.224 } 00:26:21.224 ] 00:26:21.224 }' 00:26:21.224 13:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:21.224 13:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:21.224 13:35:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:21.483 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:21.483 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:21.483 [2024-07-25 13:35:02.191894] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:21.483 [2024-07-25 13:35:02.264373] 
bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:21.483 [2024-07-25 13:35:02.264405] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:21.483 [2024-07-25 13:35:02.264414] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:21.483 [2024-07-25 13:35:02.264418] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:21.743 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:22.002 13:35:02 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:22.002 "name": "raid_bdev1", 00:26:22.002 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:22.002 "strip_size_kb": 0, 00:26:22.002 "state": "online", 00:26:22.002 "raid_level": "raid1", 00:26:22.002 "superblock": true, 00:26:22.002 "num_base_bdevs": 2, 00:26:22.002 "num_base_bdevs_discovered": 1, 00:26:22.002 "num_base_bdevs_operational": 1, 00:26:22.002 "base_bdevs_list": [ 00:26:22.002 { 00:26:22.002 "name": null, 00:26:22.002 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:22.002 "is_configured": false, 00:26:22.002 "data_offset": 256, 00:26:22.002 "data_size": 7936 00:26:22.002 }, 00:26:22.002 { 00:26:22.002 "name": "BaseBdev2", 00:26:22.002 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:22.002 "is_configured": true, 00:26:22.002 "data_offset": 256, 00:26:22.002 "data_size": 7936 00:26:22.002 } 00:26:22.002 ] 00:26:22.002 }' 00:26:22.002 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:22.002 13:35:02 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:22.572 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:22.572 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:22.572 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:22.572 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:22.572 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:22.572 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:22.572 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 
-- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:23.140 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:23.140 "name": "raid_bdev1", 00:26:23.140 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:23.140 "strip_size_kb": 0, 00:26:23.140 "state": "online", 00:26:23.140 "raid_level": "raid1", 00:26:23.140 "superblock": true, 00:26:23.140 "num_base_bdevs": 2, 00:26:23.140 "num_base_bdevs_discovered": 1, 00:26:23.140 "num_base_bdevs_operational": 1, 00:26:23.140 "base_bdevs_list": [ 00:26:23.140 { 00:26:23.140 "name": null, 00:26:23.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:23.140 "is_configured": false, 00:26:23.140 "data_offset": 256, 00:26:23.140 "data_size": 7936 00:26:23.141 }, 00:26:23.141 { 00:26:23.141 "name": "BaseBdev2", 00:26:23.141 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:23.141 "is_configured": true, 00:26:23.141 "data_offset": 256, 00:26:23.141 "data_size": 7936 00:26:23.141 } 00:26:23.141 ] 00:26:23.141 }' 00:26:23.141 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:23.141 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:23.141 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:23.141 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:23.141 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:23.141 13:35:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:23.400 [2024-07-25 13:35:04.076698] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:26:23.400 [2024-07-25 13:35:04.076728] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:23.400 [2024-07-25 13:35:04.076741] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x8aa990 00:26:23.400 [2024-07-25 13:35:04.076752] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:23.400 [2024-07-25 13:35:04.077026] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:23.400 [2024-07-25 13:35:04.077038] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:23.400 [2024-07-25 13:35:04.077082] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:23.400 [2024-07-25 13:35:04.077090] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:23.400 [2024-07-25 13:35:04.077096] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:23.400 BaseBdev1 00:26:23.400 13:35:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@789 -- # sleep 1 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.382 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:24.951 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:24.951 "name": "raid_bdev1", 00:26:24.951 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:24.951 "strip_size_kb": 0, 00:26:24.951 "state": "online", 00:26:24.951 "raid_level": "raid1", 00:26:24.951 "superblock": true, 00:26:24.951 "num_base_bdevs": 2, 00:26:24.951 "num_base_bdevs_discovered": 1, 00:26:24.951 "num_base_bdevs_operational": 1, 00:26:24.951 "base_bdevs_list": [ 00:26:24.951 { 00:26:24.951 "name": null, 00:26:24.951 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:24.951 "is_configured": false, 00:26:24.951 "data_offset": 256, 00:26:24.951 "data_size": 7936 00:26:24.951 }, 00:26:24.951 { 00:26:24.951 "name": "BaseBdev2", 00:26:24.951 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:24.951 "is_configured": true, 00:26:24.951 "data_offset": 256, 00:26:24.951 "data_size": 7936 00:26:24.951 } 00:26:24.951 ] 00:26:24.951 }' 00:26:24.951 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:24.951 13:35:05 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:25.522 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none 
none 00:26:25.522 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:25.522 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:25.522 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:25.522 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:25.781 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.781 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.781 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:25.781 "name": "raid_bdev1", 00:26:25.781 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:25.781 "strip_size_kb": 0, 00:26:25.781 "state": "online", 00:26:25.781 "raid_level": "raid1", 00:26:25.781 "superblock": true, 00:26:25.781 "num_base_bdevs": 2, 00:26:25.781 "num_base_bdevs_discovered": 1, 00:26:25.781 "num_base_bdevs_operational": 1, 00:26:25.781 "base_bdevs_list": [ 00:26:25.781 { 00:26:25.781 "name": null, 00:26:25.781 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:25.781 "is_configured": false, 00:26:25.781 "data_offset": 256, 00:26:25.781 "data_size": 7936 00:26:25.781 }, 00:26:25.781 { 00:26:25.781 "name": "BaseBdev2", 00:26:25.781 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:25.781 "is_configured": true, 00:26:25.781 "data_offset": 256, 00:26:25.781 "data_size": 7936 00:26:25.781 } 00:26:25.781 ] 00:26:25.781 }' 00:26:25.781 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:25.781 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:25.781 13:35:06 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@650 -- # local es=0 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:26.041 13:35:06 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:26.041 [2024-07-25 13:35:06.811632] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:26.041 [2024-07-25 13:35:06.811726] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:26.041 [2024-07-25 13:35:06.811734] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:26.041 request: 00:26:26.041 { 00:26:26.041 "base_bdev": "BaseBdev1", 00:26:26.041 "raid_bdev": "raid_bdev1", 00:26:26.041 "method": "bdev_raid_add_base_bdev", 00:26:26.041 "req_id": 1 00:26:26.041 } 00:26:26.041 Got JSON-RPC error response 00:26:26.041 response: 00:26:26.041 { 00:26:26.041 "code": -22, 00:26:26.041 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:26.041 } 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@653 -- # es=1 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:26:26.041 13:35:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@793 -- # sleep 1 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:27.424 13:35:07 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.424 13:35:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.424 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:27.424 "name": "raid_bdev1", 00:26:27.424 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:27.424 "strip_size_kb": 0, 00:26:27.424 "state": "online", 00:26:27.424 "raid_level": "raid1", 00:26:27.424 "superblock": true, 00:26:27.424 "num_base_bdevs": 2, 00:26:27.424 "num_base_bdevs_discovered": 1, 00:26:27.424 "num_base_bdevs_operational": 1, 00:26:27.424 "base_bdevs_list": [ 00:26:27.424 { 00:26:27.424 "name": null, 00:26:27.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.424 "is_configured": false, 00:26:27.424 "data_offset": 256, 00:26:27.424 "data_size": 7936 00:26:27.424 }, 00:26:27.424 { 00:26:27.424 "name": "BaseBdev2", 00:26:27.424 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:27.424 "is_configured": true, 00:26:27.424 "data_offset": 256, 00:26:27.424 "data_size": 7936 
00:26:27.424 } 00:26:27.424 ] 00:26:27.424 }' 00:26:27.424 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:27.424 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:27.992 "name": "raid_bdev1", 00:26:27.992 "uuid": "de3536cb-c493-4ae4-90fa-29071b2b9940", 00:26:27.992 "strip_size_kb": 0, 00:26:27.992 "state": "online", 00:26:27.992 "raid_level": "raid1", 00:26:27.992 "superblock": true, 00:26:27.992 "num_base_bdevs": 2, 00:26:27.992 "num_base_bdevs_discovered": 1, 00:26:27.992 "num_base_bdevs_operational": 1, 00:26:27.992 "base_bdevs_list": [ 00:26:27.992 { 00:26:27.992 "name": null, 00:26:27.992 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:27.992 "is_configured": false, 00:26:27.992 "data_offset": 256, 00:26:27.992 "data_size": 7936 00:26:27.992 }, 00:26:27.992 { 00:26:27.992 "name": "BaseBdev2", 00:26:27.992 "uuid": "64e695cb-ac9c-56a7-9489-288812528119", 00:26:27.992 
"is_configured": true, 00:26:27.992 "data_offset": 256, 00:26:27.992 "data_size": 7936 00:26:27.992 } 00:26:27.992 ] 00:26:27.992 }' 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:27.992 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@798 -- # killprocess 1037229 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@950 -- # '[' -z 1037229 ']' 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # kill -0 1037229 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # uname 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1037229 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1037229' 00:26:28.252 killing process with pid 1037229 00:26:28.252 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@969 -- # kill 1037229 00:26:28.252 Received shutdown signal, test time was about 60.000000 seconds 00:26:28.252 00:26:28.252 Latency(us) 00:26:28.252 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:28.252 
=================================================================================================================== 00:26:28.253 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:28.253 [2024-07-25 13:35:08.840946] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:28.253 [2024-07-25 13:35:08.841008] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:28.253 [2024-07-25 13:35:08.841041] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:28.253 [2024-07-25 13:35:08.841049] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x7e7f80 name raid_bdev1, state offline 00:26:28.253 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@974 -- # wait 1037229 00:26:28.253 [2024-07-25 13:35:08.855810] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:28.253 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@800 -- # return 0 00:26:28.253 00:26:28.253 real 0m29.758s 00:26:28.253 user 0m47.434s 00:26:28.253 sys 0m3.681s 00:26:28.253 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:28.253 13:35:08 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:28.253 ************************************ 00:26:28.253 END TEST raid_rebuild_test_sb_4k 00:26:28.253 ************************************ 00:26:28.253 13:35:09 bdev_raid -- bdev/bdev_raid.sh@984 -- # base_malloc_params='-m 32' 00:26:28.253 13:35:09 bdev_raid -- bdev/bdev_raid.sh@985 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:26:28.253 13:35:09 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:26:28.253 13:35:09 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:28.253 13:35:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:28.514 
************************************ 00:26:28.514 START TEST raid_state_function_test_sb_md_separate 00:26:28.514 ************************************ 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local 
base_bdevs 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=1042572 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1042572' 00:26:28.514 Process raid pid: 1042572 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 1042572 /var/tmp/spdk-raid.sock 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1042572 ']' 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:28.514 13:35:09 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:28.514 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:28.514 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:28.514 [2024-07-25 13:35:09.112820] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:26:28.514 [2024-07-25 13:35:09.112868] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:28.514 [2024-07-25 13:35:09.202676] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:28.514 [2024-07-25 13:35:09.267142] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:28.774 [2024-07-25 13:35:09.308561] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:28.775 [2024-07-25 13:35:09.308583] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:29.345 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:29.345 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@864 -- # return 0 00:26:29.345 13:35:09 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n 
Existed_Raid 00:26:29.345 [2024-07-25 13:35:10.123764] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:29.345 [2024-07-25 13:35:10.123797] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:29.345 [2024-07-25 13:35:10.123803] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:29.345 [2024-07-25 13:35:10.123810] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:29.605 "name": "Existed_Raid", 00:26:29.605 "uuid": "1adda679-441b-493c-9f85-a191cc6f630f", 00:26:29.605 "strip_size_kb": 0, 00:26:29.605 "state": "configuring", 00:26:29.605 "raid_level": "raid1", 00:26:29.605 "superblock": true, 00:26:29.605 "num_base_bdevs": 2, 00:26:29.605 "num_base_bdevs_discovered": 0, 00:26:29.605 "num_base_bdevs_operational": 2, 00:26:29.605 "base_bdevs_list": [ 00:26:29.605 { 00:26:29.605 "name": "BaseBdev1", 00:26:29.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.605 "is_configured": false, 00:26:29.605 "data_offset": 0, 00:26:29.605 "data_size": 0 00:26:29.605 }, 00:26:29.605 { 00:26:29.605 "name": "BaseBdev2", 00:26:29.605 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.605 "is_configured": false, 00:26:29.605 "data_offset": 0, 00:26:29.605 "data_size": 0 00:26:29.605 } 00:26:29.605 ] 00:26:29.605 }' 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:29.605 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:30.175 13:35:10 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:30.435 [2024-07-25 13:35:11.062006] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:30.435 [2024-07-25 13:35:11.062024] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217f6b0 name Existed_Raid, state configuring 00:26:30.435 13:35:11 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:30.695 [2024-07-25 13:35:11.250505] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:30.695 [2024-07-25 13:35:11.250526] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:30.695 [2024-07-25 13:35:11.250531] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:30.695 [2024-07-25 13:35:11.250541] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:30.695 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:26:30.695 [2024-07-25 13:35:11.417910] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:30.695 BaseBdev1 00:26:30.695 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:30.695 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:26:30.695 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:26:30.695 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # local i 00:26:30.695 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:26:30.695 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:26:30.695 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- 
common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:30.955 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:31.215 [ 00:26:31.215 { 00:26:31.215 "name": "BaseBdev1", 00:26:31.215 "aliases": [ 00:26:31.215 "a19ce57f-91ef-4c39-a4c0-96f4f0bcd309" 00:26:31.215 ], 00:26:31.215 "product_name": "Malloc disk", 00:26:31.215 "block_size": 4096, 00:26:31.215 "num_blocks": 8192, 00:26:31.215 "uuid": "a19ce57f-91ef-4c39-a4c0-96f4f0bcd309", 00:26:31.215 "md_size": 32, 00:26:31.215 "md_interleave": false, 00:26:31.215 "dif_type": 0, 00:26:31.215 "assigned_rate_limits": { 00:26:31.215 "rw_ios_per_sec": 0, 00:26:31.215 "rw_mbytes_per_sec": 0, 00:26:31.215 "r_mbytes_per_sec": 0, 00:26:31.215 "w_mbytes_per_sec": 0 00:26:31.215 }, 00:26:31.215 "claimed": true, 00:26:31.215 "claim_type": "exclusive_write", 00:26:31.215 "zoned": false, 00:26:31.215 "supported_io_types": { 00:26:31.215 "read": true, 00:26:31.215 "write": true, 00:26:31.215 "unmap": true, 00:26:31.215 "flush": true, 00:26:31.215 "reset": true, 00:26:31.215 "nvme_admin": false, 00:26:31.215 "nvme_io": false, 00:26:31.215 "nvme_io_md": false, 00:26:31.215 "write_zeroes": true, 00:26:31.215 "zcopy": true, 00:26:31.215 "get_zone_info": false, 00:26:31.215 "zone_management": false, 00:26:31.215 "zone_append": false, 00:26:31.215 "compare": false, 00:26:31.215 "compare_and_write": false, 00:26:31.215 "abort": true, 00:26:31.215 "seek_hole": false, 00:26:31.215 "seek_data": false, 00:26:31.215 "copy": true, 00:26:31.215 "nvme_iov_md": false 00:26:31.215 }, 00:26:31.215 "memory_domains": [ 00:26:31.215 { 00:26:31.215 "dma_device_id": "system", 00:26:31.215 "dma_device_type": 1 00:26:31.215 }, 00:26:31.215 { 00:26:31.215 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:26:31.215 "dma_device_type": 2 00:26:31.215 } 00:26:31.215 ], 00:26:31.215 "driver_specific": {} 00:26:31.215 } 00:26:31.215 ] 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@907 -- # return 0 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.215 13:35:11 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:31.474 13:35:12 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:31.474 "name": "Existed_Raid", 00:26:31.474 "uuid": "40b184d1-9b6b-40de-a41f-02cf4d2c3b8c", 00:26:31.474 "strip_size_kb": 0, 00:26:31.474 "state": "configuring", 00:26:31.474 "raid_level": "raid1", 00:26:31.474 "superblock": true, 00:26:31.474 "num_base_bdevs": 2, 00:26:31.474 "num_base_bdevs_discovered": 1, 00:26:31.474 "num_base_bdevs_operational": 2, 00:26:31.474 "base_bdevs_list": [ 00:26:31.474 { 00:26:31.474 "name": "BaseBdev1", 00:26:31.474 "uuid": "a19ce57f-91ef-4c39-a4c0-96f4f0bcd309", 00:26:31.474 "is_configured": true, 00:26:31.474 "data_offset": 256, 00:26:31.474 "data_size": 7936 00:26:31.474 }, 00:26:31.474 { 00:26:31.474 "name": "BaseBdev2", 00:26:31.474 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:31.474 "is_configured": false, 00:26:31.474 "data_offset": 0, 00:26:31.474 "data_size": 0 00:26:31.474 } 00:26:31.474 ] 00:26:31.474 }' 00:26:31.474 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:31.474 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:32.043 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:32.043 [2024-07-25 13:35:12.709178] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:32.043 [2024-07-25 13:35:12.709202] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x217efa0 name Existed_Raid, state configuring 00:26:32.043 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:32.302 [2024-07-25 
13:35:12.869613] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:32.302 [2024-07-25 13:35:12.870835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:32.302 [2024-07-25 13:35:12.870858] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.302 13:35:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:32.302 13:35:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:32.302 "name": "Existed_Raid", 00:26:32.302 "uuid": "5b6be0fd-3181-4b50-9043-fd41009170ee", 00:26:32.302 "strip_size_kb": 0, 00:26:32.302 "state": "configuring", 00:26:32.302 "raid_level": "raid1", 00:26:32.302 "superblock": true, 00:26:32.302 "num_base_bdevs": 2, 00:26:32.303 "num_base_bdevs_discovered": 1, 00:26:32.303 "num_base_bdevs_operational": 2, 00:26:32.303 "base_bdevs_list": [ 00:26:32.303 { 00:26:32.303 "name": "BaseBdev1", 00:26:32.303 "uuid": "a19ce57f-91ef-4c39-a4c0-96f4f0bcd309", 00:26:32.303 "is_configured": true, 00:26:32.303 "data_offset": 256, 00:26:32.303 "data_size": 7936 00:26:32.303 }, 00:26:32.303 { 00:26:32.303 "name": "BaseBdev2", 00:26:32.303 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:32.303 "is_configured": false, 00:26:32.303 "data_offset": 0, 00:26:32.303 "data_size": 0 00:26:32.303 } 00:26:32.303 ] 00:26:32.303 }' 00:26:32.303 13:35:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:32.303 13:35:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:33.241 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:26:33.500 [2024-07-25 13:35:14.186311] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:33.500 [2024-07-25 13:35:14.186413] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 
0x231c810 00:26:33.500 [2024-07-25 13:35:14.186421] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:33.500 [2024-07-25 13:35:14.186463] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217f550 00:26:33.500 [2024-07-25 13:35:14.186534] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x231c810 00:26:33.500 [2024-07-25 13:35:14.186539] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x231c810 00:26:33.500 [2024-07-25 13:35:14.186593] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:33.500 BaseBdev2 00:26:33.500 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:33.500 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:26:33.500 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:26:33.500 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # local i 00:26:33.500 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:26:33.500 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:26:33.500 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:33.760 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:34.020 [ 00:26:34.020 { 00:26:34.020 "name": "BaseBdev2", 00:26:34.020 "aliases": [ 00:26:34.020 
"5ba2ee20-2cf0-4db6-b5ac-610815a16277" 00:26:34.020 ], 00:26:34.020 "product_name": "Malloc disk", 00:26:34.020 "block_size": 4096, 00:26:34.020 "num_blocks": 8192, 00:26:34.020 "uuid": "5ba2ee20-2cf0-4db6-b5ac-610815a16277", 00:26:34.020 "md_size": 32, 00:26:34.020 "md_interleave": false, 00:26:34.020 "dif_type": 0, 00:26:34.020 "assigned_rate_limits": { 00:26:34.020 "rw_ios_per_sec": 0, 00:26:34.020 "rw_mbytes_per_sec": 0, 00:26:34.020 "r_mbytes_per_sec": 0, 00:26:34.020 "w_mbytes_per_sec": 0 00:26:34.020 }, 00:26:34.020 "claimed": true, 00:26:34.020 "claim_type": "exclusive_write", 00:26:34.020 "zoned": false, 00:26:34.020 "supported_io_types": { 00:26:34.020 "read": true, 00:26:34.020 "write": true, 00:26:34.020 "unmap": true, 00:26:34.020 "flush": true, 00:26:34.020 "reset": true, 00:26:34.020 "nvme_admin": false, 00:26:34.020 "nvme_io": false, 00:26:34.020 "nvme_io_md": false, 00:26:34.020 "write_zeroes": true, 00:26:34.020 "zcopy": true, 00:26:34.020 "get_zone_info": false, 00:26:34.020 "zone_management": false, 00:26:34.020 "zone_append": false, 00:26:34.020 "compare": false, 00:26:34.020 "compare_and_write": false, 00:26:34.020 "abort": true, 00:26:34.020 "seek_hole": false, 00:26:34.020 "seek_data": false, 00:26:34.020 "copy": true, 00:26:34.020 "nvme_iov_md": false 00:26:34.020 }, 00:26:34.020 "memory_domains": [ 00:26:34.020 { 00:26:34.020 "dma_device_id": "system", 00:26:34.020 "dma_device_type": 1 00:26:34.020 }, 00:26:34.020 { 00:26:34.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:34.020 "dma_device_type": 2 00:26:34.020 } 00:26:34.020 ], 00:26:34.020 "driver_specific": {} 00:26:34.020 } 00:26:34.020 ] 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@907 -- # return 0 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:34.020 "name": "Existed_Raid", 00:26:34.020 "uuid": "5b6be0fd-3181-4b50-9043-fd41009170ee", 00:26:34.020 "strip_size_kb": 0, 00:26:34.020 "state": "online", 00:26:34.020 "raid_level": "raid1", 
00:26:34.020 "superblock": true, 00:26:34.020 "num_base_bdevs": 2, 00:26:34.020 "num_base_bdevs_discovered": 2, 00:26:34.020 "num_base_bdevs_operational": 2, 00:26:34.020 "base_bdevs_list": [ 00:26:34.020 { 00:26:34.020 "name": "BaseBdev1", 00:26:34.020 "uuid": "a19ce57f-91ef-4c39-a4c0-96f4f0bcd309", 00:26:34.020 "is_configured": true, 00:26:34.020 "data_offset": 256, 00:26:34.020 "data_size": 7936 00:26:34.020 }, 00:26:34.020 { 00:26:34.020 "name": "BaseBdev2", 00:26:34.020 "uuid": "5ba2ee20-2cf0-4db6-b5ac-610815a16277", 00:26:34.020 "is_configured": true, 00:26:34.020 "data_offset": 256, 00:26:34.020 "data_size": 7936 00:26:34.020 } 00:26:34.020 ] 00:26:34.020 }' 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:34.020 13:35:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:34.959 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:34.959 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:34.959 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:34.959 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:34.959 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:34.959 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:26:34.959 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:34.959 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
Existed_Raid 00:26:35.220 [2024-07-25 13:35:15.858782] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:35.220 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:35.220 "name": "Existed_Raid", 00:26:35.220 "aliases": [ 00:26:35.220 "5b6be0fd-3181-4b50-9043-fd41009170ee" 00:26:35.220 ], 00:26:35.220 "product_name": "Raid Volume", 00:26:35.220 "block_size": 4096, 00:26:35.220 "num_blocks": 7936, 00:26:35.220 "uuid": "5b6be0fd-3181-4b50-9043-fd41009170ee", 00:26:35.220 "md_size": 32, 00:26:35.220 "md_interleave": false, 00:26:35.220 "dif_type": 0, 00:26:35.220 "assigned_rate_limits": { 00:26:35.220 "rw_ios_per_sec": 0, 00:26:35.220 "rw_mbytes_per_sec": 0, 00:26:35.220 "r_mbytes_per_sec": 0, 00:26:35.220 "w_mbytes_per_sec": 0 00:26:35.220 }, 00:26:35.220 "claimed": false, 00:26:35.220 "zoned": false, 00:26:35.220 "supported_io_types": { 00:26:35.220 "read": true, 00:26:35.220 "write": true, 00:26:35.220 "unmap": false, 00:26:35.220 "flush": false, 00:26:35.220 "reset": true, 00:26:35.220 "nvme_admin": false, 00:26:35.220 "nvme_io": false, 00:26:35.220 "nvme_io_md": false, 00:26:35.220 "write_zeroes": true, 00:26:35.220 "zcopy": false, 00:26:35.220 "get_zone_info": false, 00:26:35.220 "zone_management": false, 00:26:35.220 "zone_append": false, 00:26:35.220 "compare": false, 00:26:35.220 "compare_and_write": false, 00:26:35.220 "abort": false, 00:26:35.220 "seek_hole": false, 00:26:35.220 "seek_data": false, 00:26:35.220 "copy": false, 00:26:35.220 "nvme_iov_md": false 00:26:35.220 }, 00:26:35.220 "memory_domains": [ 00:26:35.220 { 00:26:35.220 "dma_device_id": "system", 00:26:35.220 "dma_device_type": 1 00:26:35.220 }, 00:26:35.220 { 00:26:35.220 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:35.220 "dma_device_type": 2 00:26:35.220 }, 00:26:35.220 { 00:26:35.220 "dma_device_id": "system", 00:26:35.220 "dma_device_type": 1 00:26:35.220 }, 00:26:35.220 { 00:26:35.220 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:35.220 "dma_device_type": 2 00:26:35.220 } 00:26:35.220 ], 00:26:35.220 "driver_specific": { 00:26:35.220 "raid": { 00:26:35.220 "uuid": "5b6be0fd-3181-4b50-9043-fd41009170ee", 00:26:35.220 "strip_size_kb": 0, 00:26:35.220 "state": "online", 00:26:35.220 "raid_level": "raid1", 00:26:35.220 "superblock": true, 00:26:35.220 "num_base_bdevs": 2, 00:26:35.220 "num_base_bdevs_discovered": 2, 00:26:35.220 "num_base_bdevs_operational": 2, 00:26:35.220 "base_bdevs_list": [ 00:26:35.220 { 00:26:35.220 "name": "BaseBdev1", 00:26:35.220 "uuid": "a19ce57f-91ef-4c39-a4c0-96f4f0bcd309", 00:26:35.220 "is_configured": true, 00:26:35.220 "data_offset": 256, 00:26:35.220 "data_size": 7936 00:26:35.220 }, 00:26:35.220 { 00:26:35.220 "name": "BaseBdev2", 00:26:35.220 "uuid": "5ba2ee20-2cf0-4db6-b5ac-610815a16277", 00:26:35.220 "is_configured": true, 00:26:35.220 "data_offset": 256, 00:26:35.220 "data_size": 7936 00:26:35.220 } 00:26:35.220 ] 00:26:35.220 } 00:26:35.220 } 00:26:35.220 }' 00:26:35.220 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:35.220 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:35.220 BaseBdev2' 00:26:35.220 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:35.220 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:35.220 13:35:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:35.479 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:35.479 "name": "BaseBdev1", 00:26:35.479 
"aliases": [ 00:26:35.479 "a19ce57f-91ef-4c39-a4c0-96f4f0bcd309" 00:26:35.479 ], 00:26:35.479 "product_name": "Malloc disk", 00:26:35.479 "block_size": 4096, 00:26:35.479 "num_blocks": 8192, 00:26:35.479 "uuid": "a19ce57f-91ef-4c39-a4c0-96f4f0bcd309", 00:26:35.479 "md_size": 32, 00:26:35.479 "md_interleave": false, 00:26:35.479 "dif_type": 0, 00:26:35.479 "assigned_rate_limits": { 00:26:35.479 "rw_ios_per_sec": 0, 00:26:35.479 "rw_mbytes_per_sec": 0, 00:26:35.479 "r_mbytes_per_sec": 0, 00:26:35.479 "w_mbytes_per_sec": 0 00:26:35.479 }, 00:26:35.479 "claimed": true, 00:26:35.479 "claim_type": "exclusive_write", 00:26:35.479 "zoned": false, 00:26:35.479 "supported_io_types": { 00:26:35.479 "read": true, 00:26:35.479 "write": true, 00:26:35.479 "unmap": true, 00:26:35.479 "flush": true, 00:26:35.479 "reset": true, 00:26:35.479 "nvme_admin": false, 00:26:35.479 "nvme_io": false, 00:26:35.479 "nvme_io_md": false, 00:26:35.479 "write_zeroes": true, 00:26:35.479 "zcopy": true, 00:26:35.479 "get_zone_info": false, 00:26:35.479 "zone_management": false, 00:26:35.479 "zone_append": false, 00:26:35.479 "compare": false, 00:26:35.479 "compare_and_write": false, 00:26:35.479 "abort": true, 00:26:35.479 "seek_hole": false, 00:26:35.479 "seek_data": false, 00:26:35.479 "copy": true, 00:26:35.479 "nvme_iov_md": false 00:26:35.479 }, 00:26:35.479 "memory_domains": [ 00:26:35.479 { 00:26:35.479 "dma_device_id": "system", 00:26:35.479 "dma_device_type": 1 00:26:35.479 }, 00:26:35.479 { 00:26:35.479 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:35.479 "dma_device_type": 2 00:26:35.479 } 00:26:35.479 ], 00:26:35.479 "driver_specific": {} 00:26:35.479 }' 00:26:35.479 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:35.479 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:35.479 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # 
[[ 4096 == 4096 ]] 00:26:35.479 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:35.479 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:35.738 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:35.738 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:35.738 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:35.738 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:35.738 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:35.738 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:35.738 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:35.738 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:35.738 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:35.738 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:35.997 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:35.997 "name": "BaseBdev2", 00:26:35.997 "aliases": [ 00:26:35.997 "5ba2ee20-2cf0-4db6-b5ac-610815a16277" 00:26:35.997 ], 00:26:35.997 "product_name": "Malloc disk", 00:26:35.997 "block_size": 4096, 00:26:35.998 "num_blocks": 8192, 00:26:35.998 "uuid": "5ba2ee20-2cf0-4db6-b5ac-610815a16277", 00:26:35.998 "md_size": 32, 00:26:35.998 
"md_interleave": false, 00:26:35.998 "dif_type": 0, 00:26:35.998 "assigned_rate_limits": { 00:26:35.998 "rw_ios_per_sec": 0, 00:26:35.998 "rw_mbytes_per_sec": 0, 00:26:35.998 "r_mbytes_per_sec": 0, 00:26:35.998 "w_mbytes_per_sec": 0 00:26:35.998 }, 00:26:35.998 "claimed": true, 00:26:35.998 "claim_type": "exclusive_write", 00:26:35.998 "zoned": false, 00:26:35.998 "supported_io_types": { 00:26:35.998 "read": true, 00:26:35.998 "write": true, 00:26:35.998 "unmap": true, 00:26:35.998 "flush": true, 00:26:35.998 "reset": true, 00:26:35.998 "nvme_admin": false, 00:26:35.998 "nvme_io": false, 00:26:35.998 "nvme_io_md": false, 00:26:35.998 "write_zeroes": true, 00:26:35.998 "zcopy": true, 00:26:35.998 "get_zone_info": false, 00:26:35.998 "zone_management": false, 00:26:35.998 "zone_append": false, 00:26:35.998 "compare": false, 00:26:35.998 "compare_and_write": false, 00:26:35.998 "abort": true, 00:26:35.998 "seek_hole": false, 00:26:35.998 "seek_data": false, 00:26:35.998 "copy": true, 00:26:35.998 "nvme_iov_md": false 00:26:35.998 }, 00:26:35.998 "memory_domains": [ 00:26:35.998 { 00:26:35.998 "dma_device_id": "system", 00:26:35.998 "dma_device_type": 1 00:26:35.998 }, 00:26:35.998 { 00:26:35.998 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:35.998 "dma_device_type": 2 00:26:35.998 } 00:26:35.998 ], 00:26:35.998 "driver_specific": {} 00:26:35.998 }' 00:26:35.998 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:35.998 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:36.257 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:36.257 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:36.257 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:36.257 13:35:16 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:36.257 13:35:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:36.257 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:36.518 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:36.518 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:36.518 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:36.518 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:36.518 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:36.777 [2024-07-25 13:35:17.502743] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:36.777 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:36.778 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:36.778 13:35:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:37.347 13:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:37.347 "name": "Existed_Raid", 00:26:37.347 "uuid": "5b6be0fd-3181-4b50-9043-fd41009170ee", 00:26:37.347 "strip_size_kb": 0, 00:26:37.347 "state": "online", 00:26:37.347 "raid_level": "raid1", 00:26:37.347 "superblock": true, 00:26:37.347 "num_base_bdevs": 2, 00:26:37.347 "num_base_bdevs_discovered": 1, 00:26:37.347 "num_base_bdevs_operational": 1, 00:26:37.347 "base_bdevs_list": [ 00:26:37.347 { 00:26:37.347 "name": null, 00:26:37.347 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:26:37.347 "is_configured": false, 00:26:37.347 "data_offset": 256, 00:26:37.347 "data_size": 7936 00:26:37.347 }, 00:26:37.347 { 00:26:37.347 "name": "BaseBdev2", 00:26:37.347 "uuid": "5ba2ee20-2cf0-4db6-b5ac-610815a16277", 00:26:37.347 "is_configured": true, 00:26:37.347 "data_offset": 256, 00:26:37.347 "data_size": 7936 00:26:37.347 } 00:26:37.347 ] 00:26:37.347 }' 00:26:37.347 13:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:37.347 13:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:37.916 13:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:37.916 13:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:37.916 13:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.916 13:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:38.176 13:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:38.176 13:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:38.176 13:35:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:38.746 [2024-07-25 13:35:19.365451] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:38.746 [2024-07-25 13:35:19.365515] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:38.746 [2024-07-25 
13:35:19.371881] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:38.746 [2024-07-25 13:35:19.371906] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:38.746 [2024-07-25 13:35:19.371912] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x231c810 name Existed_Raid, state offline 00:26:38.746 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:38.746 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:38.746 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:38.746 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:39.315 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:39.315 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:39.315 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:26:39.315 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 1042572 00:26:39.315 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1042572 ']' 00:26:39.315 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # kill -0 1042572 00:26:39.315 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # uname 00:26:39.315 13:35:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:39.315 13:35:19 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1042572 00:26:39.315 13:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:39.315 13:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:39.315 13:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1042572' 00:26:39.315 killing process with pid 1042572 00:26:39.315 13:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@969 -- # kill 1042572 00:26:39.315 [2024-07-25 13:35:20.005871] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:39.315 13:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@974 -- # wait 1042572 00:26:39.315 [2024-07-25 13:35:20.006501] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:39.575 13:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:26:39.575 00:26:39.575 real 0m11.076s 00:26:39.575 user 0m20.333s 00:26:39.575 sys 0m1.577s 00:26:39.575 13:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:39.575 13:35:20 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:39.575 ************************************ 00:26:39.575 END TEST raid_state_function_test_sb_md_separate 00:26:39.575 ************************************ 00:26:39.575 13:35:20 bdev_raid -- bdev/bdev_raid.sh@986 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:26:39.575 13:35:20 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:26:39.575 13:35:20 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:39.575 13:35:20 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:26:39.575 ************************************ 00:26:39.575 START TEST raid_superblock_test_md_separate 00:26:39.575 ************************************ 00:26:39.575 13:35:20 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:26:39.575 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:26:39.575 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@414 -- # local strip_size 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:26:39.576 13:35:20 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@427 -- # raid_pid=1044625 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@428 -- # waitforlisten 1044625 /var/tmp/spdk-raid.sock 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1044625 ']' 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:39.576 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:39.576 13:35:20 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:39.576 [2024-07-25 13:35:20.261039] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:26:39.576 [2024-07-25 13:35:20.261090] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1044625 ] 00:26:39.576 [2024-07-25 13:35:20.351273] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:39.836 [2024-07-25 13:35:20.419249] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:39.836 [2024-07-25 13:35:20.461747] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:39.836 [2024-07-25 13:35:20.461774] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:40.776 13:35:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:26:40.776 13:35:21 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@864 -- # return 0 00:26:40.776 13:35:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:26:40.776 13:35:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:26:40.776 13:35:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:26:40.776 13:35:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:26:40.776 13:35:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:26:40.776 13:35:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:40.776 13:35:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:26:40.776 13:35:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:40.776 13:35:21 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:26:41.036 malloc1 00:26:41.036 13:35:21 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:41.605 [2024-07-25 13:35:22.230441] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:41.605 [2024-07-25 13:35:22.230477] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:41.605 [2024-07-25 13:35:22.230490] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x277ec70 00:26:41.605 [2024-07-25 13:35:22.230496] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:41.605 [2024-07-25 13:35:22.231689] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:41.605 [2024-07-25 13:35:22.231715] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:41.605 pt1 00:26:41.605 13:35:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:26:41.605 13:35:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:26:41.605 13:35:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:26:41.605 13:35:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:26:41.605 13:35:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:26:41.605 13:35:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:41.605 13:35:22 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:26:41.605 13:35:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:41.605 13:35:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:26:41.865 malloc2 00:26:41.865 13:35:22 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:42.435 [2024-07-25 13:35:23.043020] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:42.435 [2024-07-25 13:35:23.043051] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:42.435 [2024-07-25 13:35:23.043061] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2770690 00:26:42.435 [2024-07-25 13:35:23.043072] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:42.435 [2024-07-25 13:35:23.044181] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:42.435 [2024-07-25 13:35:23.044199] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:42.435 pt2 00:26:42.435 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:26:42.435 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:26:42.435 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:26:42.695 [2024-07-25 13:35:23.319723] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:42.695 [2024-07-25 13:35:23.320762] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:42.695 [2024-07-25 13:35:23.320860] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2771160 00:26:42.695 [2024-07-25 13:35:23.320868] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:42.695 [2024-07-25 13:35:23.320922] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2766650 00:26:42.695 [2024-07-25 13:35:23.321008] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2771160 00:26:42.695 [2024-07-25 13:35:23.321013] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2771160 00:26:42.695 [2024-07-25 13:35:23.321073] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:42.695 13:35:23 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:42.695 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:43.264 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:43.264 "name": "raid_bdev1", 00:26:43.264 "uuid": "6fc26fb7-d700-4a0f-919b-9d41a419c1c4", 00:26:43.264 "strip_size_kb": 0, 00:26:43.264 "state": "online", 00:26:43.264 "raid_level": "raid1", 00:26:43.264 "superblock": true, 00:26:43.264 "num_base_bdevs": 2, 00:26:43.264 "num_base_bdevs_discovered": 2, 00:26:43.264 "num_base_bdevs_operational": 2, 00:26:43.264 "base_bdevs_list": [ 00:26:43.264 { 00:26:43.264 "name": "pt1", 00:26:43.264 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:43.264 "is_configured": true, 00:26:43.264 "data_offset": 256, 00:26:43.264 "data_size": 7936 00:26:43.264 }, 00:26:43.264 { 00:26:43.264 "name": "pt2", 00:26:43.264 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:43.264 "is_configured": true, 00:26:43.264 "data_offset": 256, 00:26:43.264 "data_size": 7936 00:26:43.264 } 00:26:43.264 ] 00:26:43.264 }' 00:26:43.264 13:35:23 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:43.264 13:35:23 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:43.833 13:35:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:26:43.833 13:35:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:26:43.833 13:35:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:43.833 13:35:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:43.834 13:35:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:43.834 13:35:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:26:43.834 13:35:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:43.834 13:35:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:44.403 [2024-07-25 13:35:25.036258] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:44.403 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:44.403 "name": "raid_bdev1", 00:26:44.403 "aliases": [ 00:26:44.403 "6fc26fb7-d700-4a0f-919b-9d41a419c1c4" 00:26:44.403 ], 00:26:44.403 "product_name": "Raid Volume", 00:26:44.403 "block_size": 4096, 00:26:44.403 "num_blocks": 7936, 00:26:44.403 "uuid": "6fc26fb7-d700-4a0f-919b-9d41a419c1c4", 00:26:44.403 "md_size": 32, 00:26:44.403 "md_interleave": false, 00:26:44.403 "dif_type": 0, 00:26:44.403 "assigned_rate_limits": { 00:26:44.403 "rw_ios_per_sec": 0, 00:26:44.403 "rw_mbytes_per_sec": 0, 00:26:44.403 "r_mbytes_per_sec": 0, 00:26:44.403 "w_mbytes_per_sec": 0 00:26:44.403 }, 00:26:44.403 "claimed": false, 00:26:44.403 "zoned": false, 00:26:44.403 "supported_io_types": { 00:26:44.403 "read": true, 00:26:44.403 "write": true, 00:26:44.403 "unmap": false, 00:26:44.403 "flush": false, 00:26:44.403 "reset": true, 00:26:44.403 "nvme_admin": false, 00:26:44.403 "nvme_io": false, 00:26:44.403 "nvme_io_md": false, 00:26:44.403 "write_zeroes": true, 
00:26:44.403 "zcopy": false, 00:26:44.403 "get_zone_info": false, 00:26:44.403 "zone_management": false, 00:26:44.403 "zone_append": false, 00:26:44.403 "compare": false, 00:26:44.403 "compare_and_write": false, 00:26:44.403 "abort": false, 00:26:44.403 "seek_hole": false, 00:26:44.403 "seek_data": false, 00:26:44.403 "copy": false, 00:26:44.403 "nvme_iov_md": false 00:26:44.403 }, 00:26:44.403 "memory_domains": [ 00:26:44.403 { 00:26:44.403 "dma_device_id": "system", 00:26:44.403 "dma_device_type": 1 00:26:44.403 }, 00:26:44.403 { 00:26:44.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:44.403 "dma_device_type": 2 00:26:44.403 }, 00:26:44.403 { 00:26:44.403 "dma_device_id": "system", 00:26:44.403 "dma_device_type": 1 00:26:44.403 }, 00:26:44.403 { 00:26:44.403 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:44.403 "dma_device_type": 2 00:26:44.403 } 00:26:44.403 ], 00:26:44.403 "driver_specific": { 00:26:44.403 "raid": { 00:26:44.403 "uuid": "6fc26fb7-d700-4a0f-919b-9d41a419c1c4", 00:26:44.403 "strip_size_kb": 0, 00:26:44.403 "state": "online", 00:26:44.403 "raid_level": "raid1", 00:26:44.403 "superblock": true, 00:26:44.403 "num_base_bdevs": 2, 00:26:44.403 "num_base_bdevs_discovered": 2, 00:26:44.403 "num_base_bdevs_operational": 2, 00:26:44.403 "base_bdevs_list": [ 00:26:44.403 { 00:26:44.403 "name": "pt1", 00:26:44.403 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:44.403 "is_configured": true, 00:26:44.403 "data_offset": 256, 00:26:44.403 "data_size": 7936 00:26:44.403 }, 00:26:44.403 { 00:26:44.403 "name": "pt2", 00:26:44.403 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:44.403 "is_configured": true, 00:26:44.403 "data_offset": 256, 00:26:44.403 "data_size": 7936 00:26:44.403 } 00:26:44.403 ] 00:26:44.403 } 00:26:44.403 } 00:26:44.403 }' 00:26:44.403 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:44.403 13:35:25 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:44.403 pt2' 00:26:44.403 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:44.403 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:44.403 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:44.972 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:44.972 "name": "pt1", 00:26:44.972 "aliases": [ 00:26:44.972 "00000000-0000-0000-0000-000000000001" 00:26:44.972 ], 00:26:44.972 "product_name": "passthru", 00:26:44.972 "block_size": 4096, 00:26:44.972 "num_blocks": 8192, 00:26:44.972 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:44.972 "md_size": 32, 00:26:44.972 "md_interleave": false, 00:26:44.972 "dif_type": 0, 00:26:44.972 "assigned_rate_limits": { 00:26:44.972 "rw_ios_per_sec": 0, 00:26:44.972 "rw_mbytes_per_sec": 0, 00:26:44.972 "r_mbytes_per_sec": 0, 00:26:44.972 "w_mbytes_per_sec": 0 00:26:44.972 }, 00:26:44.972 "claimed": true, 00:26:44.972 "claim_type": "exclusive_write", 00:26:44.972 "zoned": false, 00:26:44.972 "supported_io_types": { 00:26:44.972 "read": true, 00:26:44.972 "write": true, 00:26:44.972 "unmap": true, 00:26:44.972 "flush": true, 00:26:44.972 "reset": true, 00:26:44.972 "nvme_admin": false, 00:26:44.972 "nvme_io": false, 00:26:44.972 "nvme_io_md": false, 00:26:44.972 "write_zeroes": true, 00:26:44.972 "zcopy": true, 00:26:44.972 "get_zone_info": false, 00:26:44.972 "zone_management": false, 00:26:44.972 "zone_append": false, 00:26:44.972 "compare": false, 00:26:44.972 "compare_and_write": false, 00:26:44.972 "abort": true, 00:26:44.972 "seek_hole": false, 00:26:44.972 "seek_data": false, 00:26:44.972 "copy": true, 00:26:44.972 
"nvme_iov_md": false 00:26:44.972 }, 00:26:44.972 "memory_domains": [ 00:26:44.972 { 00:26:44.972 "dma_device_id": "system", 00:26:44.972 "dma_device_type": 1 00:26:44.972 }, 00:26:44.972 { 00:26:44.972 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:44.972 "dma_device_type": 2 00:26:44.972 } 00:26:44.972 ], 00:26:44.972 "driver_specific": { 00:26:44.972 "passthru": { 00:26:44.972 "name": "pt1", 00:26:44.972 "base_bdev_name": "malloc1" 00:26:44.972 } 00:26:44.972 } 00:26:44.972 }' 00:26:44.972 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:45.231 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:45.231 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:45.231 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:45.231 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:45.231 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:45.231 13:35:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:45.491 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:45.491 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:45.491 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:45.491 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:45.491 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:45.491 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:45.491 13:35:26 bdev_raid.raid_superblock_test_md_separate 
-- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:45.491 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:46.059 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:46.059 "name": "pt2", 00:26:46.059 "aliases": [ 00:26:46.059 "00000000-0000-0000-0000-000000000002" 00:26:46.059 ], 00:26:46.059 "product_name": "passthru", 00:26:46.059 "block_size": 4096, 00:26:46.059 "num_blocks": 8192, 00:26:46.059 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:46.059 "md_size": 32, 00:26:46.059 "md_interleave": false, 00:26:46.059 "dif_type": 0, 00:26:46.059 "assigned_rate_limits": { 00:26:46.059 "rw_ios_per_sec": 0, 00:26:46.059 "rw_mbytes_per_sec": 0, 00:26:46.059 "r_mbytes_per_sec": 0, 00:26:46.059 "w_mbytes_per_sec": 0 00:26:46.059 }, 00:26:46.059 "claimed": true, 00:26:46.059 "claim_type": "exclusive_write", 00:26:46.059 "zoned": false, 00:26:46.059 "supported_io_types": { 00:26:46.059 "read": true, 00:26:46.059 "write": true, 00:26:46.059 "unmap": true, 00:26:46.059 "flush": true, 00:26:46.059 "reset": true, 00:26:46.059 "nvme_admin": false, 00:26:46.060 "nvme_io": false, 00:26:46.060 "nvme_io_md": false, 00:26:46.060 "write_zeroes": true, 00:26:46.060 "zcopy": true, 00:26:46.060 "get_zone_info": false, 00:26:46.060 "zone_management": false, 00:26:46.060 "zone_append": false, 00:26:46.060 "compare": false, 00:26:46.060 "compare_and_write": false, 00:26:46.060 "abort": true, 00:26:46.060 "seek_hole": false, 00:26:46.060 "seek_data": false, 00:26:46.060 "copy": true, 00:26:46.060 "nvme_iov_md": false 00:26:46.060 }, 00:26:46.060 "memory_domains": [ 00:26:46.060 { 00:26:46.060 "dma_device_id": "system", 00:26:46.060 "dma_device_type": 1 00:26:46.060 }, 00:26:46.060 { 00:26:46.060 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:46.060 "dma_device_type": 2 00:26:46.060 } 
00:26:46.060 ], 00:26:46.060 "driver_specific": { 00:26:46.060 "passthru": { 00:26:46.060 "name": "pt2", 00:26:46.060 "base_bdev_name": "malloc2" 00:26:46.060 } 00:26:46.060 } 00:26:46.060 }' 00:26:46.060 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:46.320 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:46.320 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:46.320 13:35:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:46.320 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:46.320 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:46.320 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:46.579 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:46.579 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:46.579 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:46.579 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:46.838 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:46.838 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:46.838 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:26:47.407 [2024-07-25 13:35:27.895514] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:47.407 13:35:27 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=6fc26fb7-d700-4a0f-919b-9d41a419c1c4 00:26:47.407 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@451 -- # '[' -z 6fc26fb7-d700-4a0f-919b-9d41a419c1c4 ']' 00:26:47.407 13:35:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:47.407 [2024-07-25 13:35:28.107844] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:47.407 [2024-07-25 13:35:28.107856] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:47.407 [2024-07-25 13:35:28.107893] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:47.407 [2024-07-25 13:35:28.107931] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:47.407 [2024-07-25 13:35:28.107937] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2771160 name raid_bdev1, state offline 00:26:47.407 13:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.407 13:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:26:47.976 13:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:26:47.976 13:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:26:47.976 13:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:26:47.976 13:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:26:48.235 13:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:26:48.235 13:35:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:48.803 13:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:48.803 13:35:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@650 -- # local es=0 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:49.451 13:35:30 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:49.451 [2024-07-25 13:35:30.217079] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:49.451 [2024-07-25 13:35:30.218143] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:49.451 [2024-07-25 13:35:30.218187] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:49.451 [2024-07-25 13:35:30.218216] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:26:49.451 [2024-07-25 13:35:30.218227] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:49.451 [2024-07-25 13:35:30.218233] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2763bc0 name raid_bdev1, state configuring 00:26:49.451 request: 00:26:49.451 { 00:26:49.451 "name": "raid_bdev1", 00:26:49.451 "raid_level": "raid1", 00:26:49.451 "base_bdevs": [ 
00:26:49.451 "malloc1", 00:26:49.451 "malloc2" 00:26:49.451 ], 00:26:49.451 "superblock": false, 00:26:49.451 "method": "bdev_raid_create", 00:26:49.451 "req_id": 1 00:26:49.451 } 00:26:49.451 Got JSON-RPC error response 00:26:49.451 response: 00:26:49.451 { 00:26:49.451 "code": -17, 00:26:49.451 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:26:49.451 } 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@653 -- # es=1 00:26:49.451 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:26:49.452 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:26:49.452 13:35:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:26:49.728 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.728 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:26:49.728 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:26:49.728 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:26:49.728 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:50.298 [2024-07-25 13:35:30.934820] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:50.298 [2024-07-25 13:35:30.934849] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:50.298 [2024-07-25 13:35:30.934858] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x277f320 
00:26:50.298 [2024-07-25 13:35:30.934865] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:50.298 [2024-07-25 13:35:30.936008] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:50.298 [2024-07-25 13:35:30.936028] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:50.298 [2024-07-25 13:35:30.936058] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:50.298 [2024-07-25 13:35:30.936075] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:50.298 pt1 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:50.298 13:35:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.558 13:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:50.558 "name": "raid_bdev1", 00:26:50.558 "uuid": "6fc26fb7-d700-4a0f-919b-9d41a419c1c4", 00:26:50.558 "strip_size_kb": 0, 00:26:50.558 "state": "configuring", 00:26:50.558 "raid_level": "raid1", 00:26:50.558 "superblock": true, 00:26:50.558 "num_base_bdevs": 2, 00:26:50.558 "num_base_bdevs_discovered": 1, 00:26:50.558 "num_base_bdevs_operational": 2, 00:26:50.558 "base_bdevs_list": [ 00:26:50.558 { 00:26:50.558 "name": "pt1", 00:26:50.558 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:50.558 "is_configured": true, 00:26:50.558 "data_offset": 256, 00:26:50.558 "data_size": 7936 00:26:50.558 }, 00:26:50.558 { 00:26:50.558 "name": null, 00:26:50.558 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:50.558 "is_configured": false, 00:26:50.558 "data_offset": 256, 00:26:50.558 "data_size": 7936 00:26:50.558 } 00:26:50.558 ] 00:26:50.558 }' 00:26:50.558 13:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:50.558 13:35:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:51.127 13:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:26:51.127 13:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:26:51.127 13:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:26:51.127 13:35:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b 
malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:51.697 [2024-07-25 13:35:32.238145] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:51.697 [2024-07-25 13:35:32.238178] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:51.697 [2024-07-25 13:35:32.238197] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27708c0 00:26:51.697 [2024-07-25 13:35:32.238204] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:51.697 [2024-07-25 13:35:32.238353] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:51.697 [2024-07-25 13:35:32.238365] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:51.697 [2024-07-25 13:35:32.238396] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:51.697 [2024-07-25 13:35:32.238409] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:51.697 [2024-07-25 13:35:32.238481] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2765010 00:26:51.697 [2024-07-25 13:35:32.238487] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:51.697 [2024-07-25 13:35:32.238525] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25d9dd0 00:26:51.697 [2024-07-25 13:35:32.238612] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2765010 00:26:51.697 [2024-07-25 13:35:32.238618] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2765010 00:26:51.697 [2024-07-25 13:35:32.238670] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:51.697 pt2 00:26:51.697 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:26:51.697 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@493 -- # (( i < 
num_base_bdevs )) 00:26:51.697 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:51.697 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:51.697 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:51.698 "name": "raid_bdev1", 00:26:51.698 "uuid": "6fc26fb7-d700-4a0f-919b-9d41a419c1c4", 00:26:51.698 "strip_size_kb": 0, 00:26:51.698 "state": "online", 00:26:51.698 "raid_level": "raid1", 00:26:51.698 "superblock": true, 00:26:51.698 "num_base_bdevs": 2, 00:26:51.698 
"num_base_bdevs_discovered": 2, 00:26:51.698 "num_base_bdevs_operational": 2, 00:26:51.698 "base_bdevs_list": [ 00:26:51.698 { 00:26:51.698 "name": "pt1", 00:26:51.698 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:51.698 "is_configured": true, 00:26:51.698 "data_offset": 256, 00:26:51.698 "data_size": 7936 00:26:51.698 }, 00:26:51.698 { 00:26:51.698 "name": "pt2", 00:26:51.698 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:51.698 "is_configured": true, 00:26:51.698 "data_offset": 256, 00:26:51.698 "data_size": 7936 00:26:51.698 } 00:26:51.698 ] 00:26:51.698 }' 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:51.698 13:35:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:52.268 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:26:52.268 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:52.268 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:52.268 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:52.268 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:52.268 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:26:52.268 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:52.268 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:52.529 [2024-07-25 13:35:33.184747] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:52.529 13:35:33 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:52.529 "name": "raid_bdev1", 00:26:52.529 "aliases": [ 00:26:52.529 "6fc26fb7-d700-4a0f-919b-9d41a419c1c4" 00:26:52.529 ], 00:26:52.529 "product_name": "Raid Volume", 00:26:52.529 "block_size": 4096, 00:26:52.529 "num_blocks": 7936, 00:26:52.529 "uuid": "6fc26fb7-d700-4a0f-919b-9d41a419c1c4", 00:26:52.529 "md_size": 32, 00:26:52.529 "md_interleave": false, 00:26:52.529 "dif_type": 0, 00:26:52.529 "assigned_rate_limits": { 00:26:52.529 "rw_ios_per_sec": 0, 00:26:52.529 "rw_mbytes_per_sec": 0, 00:26:52.529 "r_mbytes_per_sec": 0, 00:26:52.529 "w_mbytes_per_sec": 0 00:26:52.529 }, 00:26:52.529 "claimed": false, 00:26:52.529 "zoned": false, 00:26:52.529 "supported_io_types": { 00:26:52.529 "read": true, 00:26:52.529 "write": true, 00:26:52.529 "unmap": false, 00:26:52.529 "flush": false, 00:26:52.529 "reset": true, 00:26:52.529 "nvme_admin": false, 00:26:52.529 "nvme_io": false, 00:26:52.529 "nvme_io_md": false, 00:26:52.529 "write_zeroes": true, 00:26:52.529 "zcopy": false, 00:26:52.529 "get_zone_info": false, 00:26:52.529 "zone_management": false, 00:26:52.529 "zone_append": false, 00:26:52.529 "compare": false, 00:26:52.529 "compare_and_write": false, 00:26:52.529 "abort": false, 00:26:52.529 "seek_hole": false, 00:26:52.529 "seek_data": false, 00:26:52.529 "copy": false, 00:26:52.529 "nvme_iov_md": false 00:26:52.529 }, 00:26:52.529 "memory_domains": [ 00:26:52.529 { 00:26:52.529 "dma_device_id": "system", 00:26:52.529 "dma_device_type": 1 00:26:52.529 }, 00:26:52.529 { 00:26:52.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:52.529 "dma_device_type": 2 00:26:52.529 }, 00:26:52.529 { 00:26:52.529 "dma_device_id": "system", 00:26:52.529 "dma_device_type": 1 00:26:52.529 }, 00:26:52.529 { 00:26:52.529 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:52.529 "dma_device_type": 2 00:26:52.529 } 00:26:52.529 ], 00:26:52.529 "driver_specific": { 00:26:52.529 "raid": { 
00:26:52.529 "uuid": "6fc26fb7-d700-4a0f-919b-9d41a419c1c4", 00:26:52.529 "strip_size_kb": 0, 00:26:52.529 "state": "online", 00:26:52.529 "raid_level": "raid1", 00:26:52.529 "superblock": true, 00:26:52.529 "num_base_bdevs": 2, 00:26:52.529 "num_base_bdevs_discovered": 2, 00:26:52.529 "num_base_bdevs_operational": 2, 00:26:52.529 "base_bdevs_list": [ 00:26:52.529 { 00:26:52.529 "name": "pt1", 00:26:52.529 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:52.529 "is_configured": true, 00:26:52.529 "data_offset": 256, 00:26:52.529 "data_size": 7936 00:26:52.529 }, 00:26:52.529 { 00:26:52.529 "name": "pt2", 00:26:52.529 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:52.529 "is_configured": true, 00:26:52.529 "data_offset": 256, 00:26:52.529 "data_size": 7936 00:26:52.529 } 00:26:52.529 ] 00:26:52.529 } 00:26:52.529 } 00:26:52.529 }' 00:26:52.529 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:52.529 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:52.529 pt2' 00:26:52.529 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:52.529 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:52.529 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:53.098 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:53.098 "name": "pt1", 00:26:53.098 "aliases": [ 00:26:53.098 "00000000-0000-0000-0000-000000000001" 00:26:53.098 ], 00:26:53.098 "product_name": "passthru", 00:26:53.098 "block_size": 4096, 00:26:53.098 "num_blocks": 8192, 00:26:53.098 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:26:53.098 "md_size": 32, 00:26:53.098 "md_interleave": false, 00:26:53.098 "dif_type": 0, 00:26:53.098 "assigned_rate_limits": { 00:26:53.098 "rw_ios_per_sec": 0, 00:26:53.098 "rw_mbytes_per_sec": 0, 00:26:53.098 "r_mbytes_per_sec": 0, 00:26:53.098 "w_mbytes_per_sec": 0 00:26:53.098 }, 00:26:53.098 "claimed": true, 00:26:53.098 "claim_type": "exclusive_write", 00:26:53.098 "zoned": false, 00:26:53.098 "supported_io_types": { 00:26:53.098 "read": true, 00:26:53.098 "write": true, 00:26:53.098 "unmap": true, 00:26:53.098 "flush": true, 00:26:53.098 "reset": true, 00:26:53.098 "nvme_admin": false, 00:26:53.098 "nvme_io": false, 00:26:53.098 "nvme_io_md": false, 00:26:53.098 "write_zeroes": true, 00:26:53.098 "zcopy": true, 00:26:53.098 "get_zone_info": false, 00:26:53.098 "zone_management": false, 00:26:53.098 "zone_append": false, 00:26:53.098 "compare": false, 00:26:53.098 "compare_and_write": false, 00:26:53.098 "abort": true, 00:26:53.098 "seek_hole": false, 00:26:53.098 "seek_data": false, 00:26:53.098 "copy": true, 00:26:53.098 "nvme_iov_md": false 00:26:53.098 }, 00:26:53.098 "memory_domains": [ 00:26:53.098 { 00:26:53.098 "dma_device_id": "system", 00:26:53.098 "dma_device_type": 1 00:26:53.098 }, 00:26:53.098 { 00:26:53.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:53.098 "dma_device_type": 2 00:26:53.098 } 00:26:53.098 ], 00:26:53.098 "driver_specific": { 00:26:53.098 "passthru": { 00:26:53.098 "name": "pt1", 00:26:53.098 "base_bdev_name": "malloc1" 00:26:53.098 } 00:26:53.098 } 00:26:53.098 }' 00:26:53.098 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:53.358 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:53.358 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:53.358 13:35:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 
-- # jq .md_size 00:26:53.358 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:53.358 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:53.358 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:53.358 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:53.619 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:53.619 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:53.619 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:53.619 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:53.619 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:53.619 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:53.619 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:54.188 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:54.188 "name": "pt2", 00:26:54.188 "aliases": [ 00:26:54.188 "00000000-0000-0000-0000-000000000002" 00:26:54.188 ], 00:26:54.188 "product_name": "passthru", 00:26:54.188 "block_size": 4096, 00:26:54.188 "num_blocks": 8192, 00:26:54.188 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:54.188 "md_size": 32, 00:26:54.188 "md_interleave": false, 00:26:54.188 "dif_type": 0, 00:26:54.188 "assigned_rate_limits": { 00:26:54.188 "rw_ios_per_sec": 0, 00:26:54.188 "rw_mbytes_per_sec": 0, 00:26:54.188 "r_mbytes_per_sec": 0, 00:26:54.188 
"w_mbytes_per_sec": 0 00:26:54.188 }, 00:26:54.188 "claimed": true, 00:26:54.188 "claim_type": "exclusive_write", 00:26:54.188 "zoned": false, 00:26:54.188 "supported_io_types": { 00:26:54.188 "read": true, 00:26:54.188 "write": true, 00:26:54.188 "unmap": true, 00:26:54.188 "flush": true, 00:26:54.188 "reset": true, 00:26:54.188 "nvme_admin": false, 00:26:54.188 "nvme_io": false, 00:26:54.188 "nvme_io_md": false, 00:26:54.188 "write_zeroes": true, 00:26:54.188 "zcopy": true, 00:26:54.188 "get_zone_info": false, 00:26:54.188 "zone_management": false, 00:26:54.188 "zone_append": false, 00:26:54.188 "compare": false, 00:26:54.188 "compare_and_write": false, 00:26:54.188 "abort": true, 00:26:54.188 "seek_hole": false, 00:26:54.188 "seek_data": false, 00:26:54.188 "copy": true, 00:26:54.188 "nvme_iov_md": false 00:26:54.188 }, 00:26:54.188 "memory_domains": [ 00:26:54.188 { 00:26:54.188 "dma_device_id": "system", 00:26:54.188 "dma_device_type": 1 00:26:54.188 }, 00:26:54.188 { 00:26:54.188 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:54.188 "dma_device_type": 2 00:26:54.188 } 00:26:54.188 ], 00:26:54.188 "driver_specific": { 00:26:54.188 "passthru": { 00:26:54.188 "name": "pt2", 00:26:54.188 "base_bdev_name": "malloc2" 00:26:54.188 } 00:26:54.188 } 00:26:54.188 }' 00:26:54.188 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:54.188 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:54.188 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:54.188 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:54.448 13:35:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:54.448 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:54.449 13:35:35 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:54.449 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:54.449 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:54.449 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:54.449 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:54.449 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:54.449 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:54.449 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:26:54.709 [2024-07-25 13:35:35.394346] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:54.709 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@502 -- # '[' 6fc26fb7-d700-4a0f-919b-9d41a419c1c4 '!=' 6fc26fb7-d700-4a0f-919b-9d41a419c1c4 ']' 00:26:54.709 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:26:54.709 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:54.709 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:26:54.709 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:55.279 [2024-07-25 13:35:35.927513] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:55.279 13:35:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:55.538 13:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:55.538 "name": "raid_bdev1", 00:26:55.538 "uuid": "6fc26fb7-d700-4a0f-919b-9d41a419c1c4", 00:26:55.538 "strip_size_kb": 0, 00:26:55.538 "state": "online", 00:26:55.538 "raid_level": "raid1", 00:26:55.538 "superblock": true, 00:26:55.538 "num_base_bdevs": 2, 00:26:55.538 "num_base_bdevs_discovered": 1, 00:26:55.538 "num_base_bdevs_operational": 1, 00:26:55.538 
"base_bdevs_list": [ 00:26:55.538 { 00:26:55.538 "name": null, 00:26:55.538 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:55.538 "is_configured": false, 00:26:55.538 "data_offset": 256, 00:26:55.538 "data_size": 7936 00:26:55.538 }, 00:26:55.538 { 00:26:55.538 "name": "pt2", 00:26:55.538 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:55.538 "is_configured": true, 00:26:55.538 "data_offset": 256, 00:26:55.538 "data_size": 7936 00:26:55.538 } 00:26:55.538 ] 00:26:55.538 }' 00:26:55.538 13:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:55.538 13:35:36 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:56.109 13:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:56.109 [2024-07-25 13:35:36.869886] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:56.109 [2024-07-25 13:35:36.869902] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:56.109 [2024-07-25 13:35:36.869934] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:56.109 [2024-07-25 13:35:36.869963] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:56.109 [2024-07-25 13:35:36.869969] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2765010 name raid_bdev1, state offline 00:26:56.109 13:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.109 13:35:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:26:56.369 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@515 
-- # raid_bdev= 00:26:56.369 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:26:56.369 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:26:56.369 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:26:56.369 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:56.629 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:26:56.629 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:26:56.629 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:26:56.629 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:26:56.629 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@534 -- # i=1 00:26:56.629 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:56.890 [2024-07-25 13:35:37.431282] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:56.890 [2024-07-25 13:35:37.431310] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:56.890 [2024-07-25 13:35:37.431320] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25d99e0 00:26:56.890 [2024-07-25 13:35:37.431326] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:56.890 [2024-07-25 13:35:37.432461] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:56.890 
[2024-07-25 13:35:37.432480] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:56.890 [2024-07-25 13:35:37.432512] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:56.890 [2024-07-25 13:35:37.432531] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:56.890 [2024-07-25 13:35:37.432596] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2765330 00:26:56.890 [2024-07-25 13:35:37.432602] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:56.890 [2024-07-25 13:35:37.432643] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2766d70 00:26:56.890 [2024-07-25 13:35:37.432721] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2765330 00:26:56.891 [2024-07-25 13:35:37.432731] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2765330 00:26:56.891 [2024-07-25 13:35:37.432780] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:56.891 pt2 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@538 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:56.891 "name": "raid_bdev1", 00:26:56.891 "uuid": "6fc26fb7-d700-4a0f-919b-9d41a419c1c4", 00:26:56.891 "strip_size_kb": 0, 00:26:56.891 "state": "online", 00:26:56.891 "raid_level": "raid1", 00:26:56.891 "superblock": true, 00:26:56.891 "num_base_bdevs": 2, 00:26:56.891 "num_base_bdevs_discovered": 1, 00:26:56.891 "num_base_bdevs_operational": 1, 00:26:56.891 "base_bdevs_list": [ 00:26:56.891 { 00:26:56.891 "name": null, 00:26:56.891 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:56.891 "is_configured": false, 00:26:56.891 "data_offset": 256, 00:26:56.891 "data_size": 7936 00:26:56.891 }, 00:26:56.891 { 00:26:56.891 "name": "pt2", 00:26:56.891 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:56.891 "is_configured": true, 00:26:56.891 "data_offset": 256, 00:26:56.891 "data_size": 7936 00:26:56.891 } 00:26:56.891 ] 00:26:56.891 }' 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:56.891 13:35:37 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:57.461 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:57.721 [2024-07-25 13:35:38.377676] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:57.721 [2024-07-25 13:35:38.377690] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:57.721 [2024-07-25 13:35:38.377720] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:57.721 [2024-07-25 13:35:38.377748] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:57.721 [2024-07-25 13:35:38.377754] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2765330 name raid_bdev1, state offline 00:26:57.721 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.721 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:26:57.981 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@542 -- # raid_bdev= 00:26:57.981 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:26:57.981 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:26:57.981 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:57.981 [2024-07-25 13:35:38.754615] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:57.981 [2024-07-25 13:35:38.754638] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:57.981 [2024-07-25 13:35:38.754647] vbdev_passthru.c: 
681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x277eea0 00:26:57.981 [2024-07-25 13:35:38.754653] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:57.981 [2024-07-25 13:35:38.755764] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:57.981 [2024-07-25 13:35:38.755781] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:57.981 [2024-07-25 13:35:38.755810] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:57.981 [2024-07-25 13:35:38.755826] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:57.981 [2024-07-25 13:35:38.755894] bdev_raid.c:3665:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:26:57.981 [2024-07-25 13:35:38.755901] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:57.981 [2024-07-25 13:35:38.755909] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2764680 name raid_bdev1, state configuring 00:26:57.981 [2024-07-25 13:35:38.755921] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:57.981 [2024-07-25 13:35:38.755957] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x2766b00 00:26:57.981 [2024-07-25 13:35:38.755962] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:57.981 [2024-07-25 13:35:38.755996] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27649a0 00:26:57.981 [2024-07-25 13:35:38.756072] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2766b00 00:26:57.981 [2024-07-25 13:35:38.756077] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2766b00 00:26:57.981 [2024-07-25 13:35:38.756130] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:26:57.981 pt1 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:58.242 "name": "raid_bdev1", 00:26:58.242 "uuid": "6fc26fb7-d700-4a0f-919b-9d41a419c1c4", 00:26:58.242 "strip_size_kb": 0, 00:26:58.242 "state": "online", 00:26:58.242 "raid_level": 
"raid1", 00:26:58.242 "superblock": true, 00:26:58.242 "num_base_bdevs": 2, 00:26:58.242 "num_base_bdevs_discovered": 1, 00:26:58.242 "num_base_bdevs_operational": 1, 00:26:58.242 "base_bdevs_list": [ 00:26:58.242 { 00:26:58.242 "name": null, 00:26:58.242 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:58.242 "is_configured": false, 00:26:58.242 "data_offset": 256, 00:26:58.242 "data_size": 7936 00:26:58.242 }, 00:26:58.242 { 00:26:58.242 "name": "pt2", 00:26:58.242 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:58.242 "is_configured": true, 00:26:58.242 "data_offset": 256, 00:26:58.242 "data_size": 7936 00:26:58.242 } 00:26:58.242 ] 00:26:58.242 }' 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:58.242 13:35:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:58.812 13:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:26:58.812 13:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:26:59.072 13:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:26:59.072 13:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:59.072 13:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:26:59.333 [2024-07-25 13:35:39.877656] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@573 -- # '[' 6fc26fb7-d700-4a0f-919b-9d41a419c1c4 '!=' 6fc26fb7-d700-4a0f-919b-9d41a419c1c4 ']' 
00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@578 -- # killprocess 1044625 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1044625 ']' 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # kill -0 1044625 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # uname 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1044625 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1044625' 00:26:59.333 killing process with pid 1044625 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@969 -- # kill 1044625 00:26:59.333 [2024-07-25 13:35:39.947111] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:59.333 [2024-07-25 13:35:39.947146] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:59.333 [2024-07-25 13:35:39.947175] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:59.333 [2024-07-25 13:35:39.947181] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2766b00 name raid_bdev1, state offline 00:26:59.333 13:35:39 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@974 -- # wait 1044625 00:26:59.333 [2024-07-25 13:35:39.959628] 
bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:59.333 13:35:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@580 -- # return 0 00:26:59.333 00:26:59.333 real 0m19.874s 00:26:59.333 user 0m37.261s 00:26:59.333 sys 0m2.565s 00:26:59.333 13:35:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 00:26:59.333 13:35:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:59.333 ************************************ 00:26:59.333 END TEST raid_superblock_test_md_separate 00:26:59.333 ************************************ 00:26:59.333 13:35:40 bdev_raid -- bdev/bdev_raid.sh@987 -- # '[' true = true ']' 00:26:59.333 13:35:40 bdev_raid -- bdev/bdev_raid.sh@988 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:26:59.333 13:35:40 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:26:59.333 13:35:40 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:26:59.333 13:35:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:59.594 ************************************ 00:26:59.594 START TEST raid_rebuild_test_sb_md_separate 00:26:59.594 ************************************ 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false true 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # local 
verify=true 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # local strip_size 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # local create_arg 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@594 -- # local data_offset 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:26:59.594 13:35:40 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # raid_pid=1048297 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@613 -- # waitforlisten 1048297 /var/tmp/spdk-raid.sock 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@831 -- # '[' -z 1048297 ']' 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@836 -- # local max_retries=100 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:59.594 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@840 -- # xtrace_disable 00:26:59.594 13:35:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:59.594 [2024-07-25 13:35:40.222634] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:26:59.594 [2024-07-25 13:35:40.222691] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1048297 ] 00:26:59.594 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:59.594 Zero copy mechanism will not be used. 00:26:59.594 [2024-07-25 13:35:40.312963] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:59.594 [2024-07-25 13:35:40.380613] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:26:59.854 [2024-07-25 13:35:40.420667] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:59.854 [2024-07-25 13:35:40.420691] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:00.796 13:35:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:00.796 13:35:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@864 -- # return 0 00:27:00.796 13:35:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:00.796 13:35:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:27:01.056 BaseBdev1_malloc 00:27:01.056 13:35:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:01.626 [2024-07-25 13:35:42.120689] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:01.626 [2024-07-25 13:35:42.120724] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:01.626 [2024-07-25 
13:35:42.120739] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9af060 00:27:01.626 [2024-07-25 13:35:42.120745] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:01.626 [2024-07-25 13:35:42.121943] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:01.626 [2024-07-25 13:35:42.121963] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:01.626 BaseBdev1 00:27:01.626 13:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:01.626 13:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:27:01.626 BaseBdev2_malloc 00:27:01.626 13:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:02.196 [2024-07-25 13:35:42.857121] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:02.196 [2024-07-25 13:35:42.857152] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:02.196 [2024-07-25 13:35:42.857164] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9af7a0 00:27:02.196 [2024-07-25 13:35:42.857171] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:02.196 [2024-07-25 13:35:42.858257] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:02.196 [2024-07-25 13:35:42.858276] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:02.196 BaseBdev2 00:27:02.196 13:35:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@622 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:27:02.456 spare_malloc 00:27:02.456 13:35:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:27:02.720 spare_delay 00:27:02.720 13:35:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:03.291 [2024-07-25 13:35:43.785811] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:03.291 [2024-07-25 13:35:43.785838] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:03.291 [2024-07-25 13:35:43.785852] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x80cf10 00:27:03.291 [2024-07-25 13:35:43.785858] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:03.291 [2024-07-25 13:35:43.786952] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:03.291 [2024-07-25 13:35:43.786971] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:03.291 spare 00:27:03.291 13:35:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:03.291 [2024-07-25 13:35:43.994363] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:03.291 [2024-07-25 13:35:43.995366] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:03.291 [2024-07-25 13:35:43.995470] 
bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x80ea10 00:27:03.291 [2024-07-25 13:35:43.995478] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:03.291 [2024-07-25 13:35:43.995534] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x80b620 00:27:03.291 [2024-07-25 13:35:43.995627] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x80ea10 00:27:03.291 [2024-07-25 13:35:43.995633] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x80ea10 00:27:03.291 [2024-07-25 13:35:43.995692] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.291 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.551 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:03.551 "name": "raid_bdev1", 00:27:03.551 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:03.551 "strip_size_kb": 0, 00:27:03.551 "state": "online", 00:27:03.551 "raid_level": "raid1", 00:27:03.551 "superblock": true, 00:27:03.551 "num_base_bdevs": 2, 00:27:03.551 "num_base_bdevs_discovered": 2, 00:27:03.551 "num_base_bdevs_operational": 2, 00:27:03.551 "base_bdevs_list": [ 00:27:03.551 { 00:27:03.551 "name": "BaseBdev1", 00:27:03.551 "uuid": "afbe8b0f-2f38-55e5-8bfa-f8a6089a610b", 00:27:03.551 "is_configured": true, 00:27:03.551 "data_offset": 256, 00:27:03.551 "data_size": 7936 00:27:03.551 }, 00:27:03.551 { 00:27:03.551 "name": "BaseBdev2", 00:27:03.551 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:03.551 "is_configured": true, 00:27:03.551 "data_offset": 256, 00:27:03.551 "data_size": 7936 00:27:03.551 } 00:27:03.551 ] 00:27:03.551 }' 00:27:03.551 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:03.551 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:04.122 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:04.122 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:27:04.383 [2024-07-25 13:35:44.944938] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:04.383 13:35:44 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:27:04.383 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:04.383 13:35:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # '[' true = true ']' 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@640 -- # local write_unit_size 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@643 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:04.383 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:04.383 13:35:45 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:27:04.642 [2024-07-25 13:35:45.329740] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x80b620 00:27:04.642 /dev/nbd0 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:04.642 1+0 records in 00:27:04.642 1+0 records out 00:27:04.642 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000301405 s, 13.6 MB/s 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:04.642 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:27:04.643 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:04.643 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:27:04.643 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@644 -- # '[' raid1 = raid5f ']' 00:27:04.643 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@648 -- # write_unit_size=1 00:27:04.643 13:35:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@650 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:27:05.582 7936+0 records in 00:27:05.582 7936+0 records out 00:27:05.582 32505856 bytes (33 MB, 31 MiB) copied, 0.653332 s, 49.8 MB/s 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@651 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:27:05.582 13:35:46 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:05.582 [2024-07-25 13:35:46.236884] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:05.582 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:05.842 [2024-07-25 13:35:46.413351] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:05.842 "name": "raid_bdev1", 00:27:05.842 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:05.842 "strip_size_kb": 0, 00:27:05.842 "state": "online", 00:27:05.842 "raid_level": "raid1", 00:27:05.842 "superblock": true, 00:27:05.842 "num_base_bdevs": 2, 00:27:05.842 "num_base_bdevs_discovered": 1, 00:27:05.842 "num_base_bdevs_operational": 1, 00:27:05.842 "base_bdevs_list": [ 00:27:05.842 { 00:27:05.842 "name": null, 00:27:05.842 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.842 "is_configured": false, 00:27:05.842 "data_offset": 256, 00:27:05.842 "data_size": 7936 00:27:05.842 }, 00:27:05.842 { 00:27:05.842 "name": "BaseBdev2", 
00:27:05.842 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:05.842 "is_configured": true, 00:27:05.842 "data_offset": 256, 00:27:05.842 "data_size": 7936 00:27:05.842 } 00:27:05.842 ] 00:27:05.842 }' 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:05.842 13:35:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:06.783 13:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:07.042 [2024-07-25 13:35:47.672542] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:07.042 [2024-07-25 13:35:47.674168] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x811c50 00:27:07.042 [2024-07-25 13:35:47.675717] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:07.042 13:35:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:07.982 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:07.982 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:07.982 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:07.982 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:07.982 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:07.982 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:07.982 13:35:48 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:08.242 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:08.242 "name": "raid_bdev1", 00:27:08.242 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:08.242 "strip_size_kb": 0, 00:27:08.242 "state": "online", 00:27:08.242 "raid_level": "raid1", 00:27:08.242 "superblock": true, 00:27:08.242 "num_base_bdevs": 2, 00:27:08.242 "num_base_bdevs_discovered": 2, 00:27:08.242 "num_base_bdevs_operational": 2, 00:27:08.242 "process": { 00:27:08.242 "type": "rebuild", 00:27:08.242 "target": "spare", 00:27:08.242 "progress": { 00:27:08.242 "blocks": 2816, 00:27:08.242 "percent": 35 00:27:08.242 } 00:27:08.242 }, 00:27:08.242 "base_bdevs_list": [ 00:27:08.242 { 00:27:08.242 "name": "spare", 00:27:08.242 "uuid": "126b8242-5250-507a-9dda-39e148801b52", 00:27:08.242 "is_configured": true, 00:27:08.242 "data_offset": 256, 00:27:08.242 "data_size": 7936 00:27:08.242 }, 00:27:08.242 { 00:27:08.242 "name": "BaseBdev2", 00:27:08.242 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:08.242 "is_configured": true, 00:27:08.242 "data_offset": 256, 00:27:08.242 "data_size": 7936 00:27:08.242 } 00:27:08.242 ] 00:27:08.242 }' 00:27:08.242 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:08.242 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:08.242 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:08.242 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:08.242 13:35:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:27:08.503 [2024-07-25 13:35:49.157551] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:08.503 [2024-07-25 13:35:49.184635] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:08.503 [2024-07-25 13:35:49.184667] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:08.503 [2024-07-25 13:35:49.184677] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:08.503 [2024-07-25 13:35:49.184681] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.503 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:08.762 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:08.762 "name": "raid_bdev1", 00:27:08.762 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:08.762 "strip_size_kb": 0, 00:27:08.762 "state": "online", 00:27:08.762 "raid_level": "raid1", 00:27:08.762 "superblock": true, 00:27:08.762 "num_base_bdevs": 2, 00:27:08.762 "num_base_bdevs_discovered": 1, 00:27:08.762 "num_base_bdevs_operational": 1, 00:27:08.762 "base_bdevs_list": [ 00:27:08.762 { 00:27:08.762 "name": null, 00:27:08.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.762 "is_configured": false, 00:27:08.762 "data_offset": 256, 00:27:08.762 "data_size": 7936 00:27:08.762 }, 00:27:08.762 { 00:27:08.762 "name": "BaseBdev2", 00:27:08.762 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:08.762 "is_configured": true, 00:27:08.762 "data_offset": 256, 00:27:08.762 "data_size": 7936 00:27:08.762 } 00:27:08.762 ] 00:27:08.762 }' 00:27:08.762 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:08.762 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:09.333 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:09.333 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:09.333 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:09.333 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:09.333 13:35:49 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:09.333 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:09.333 13:35:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:09.333 13:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:09.334 "name": "raid_bdev1", 00:27:09.334 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:09.334 "strip_size_kb": 0, 00:27:09.334 "state": "online", 00:27:09.334 "raid_level": "raid1", 00:27:09.334 "superblock": true, 00:27:09.334 "num_base_bdevs": 2, 00:27:09.334 "num_base_bdevs_discovered": 1, 00:27:09.334 "num_base_bdevs_operational": 1, 00:27:09.334 "base_bdevs_list": [ 00:27:09.334 { 00:27:09.334 "name": null, 00:27:09.334 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:09.334 "is_configured": false, 00:27:09.334 "data_offset": 256, 00:27:09.334 "data_size": 7936 00:27:09.334 }, 00:27:09.334 { 00:27:09.334 "name": "BaseBdev2", 00:27:09.334 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:09.334 "is_configured": true, 00:27:09.334 "data_offset": 256, 00:27:09.334 "data_size": 7936 00:27:09.334 } 00:27:09.334 ] 00:27:09.334 }' 00:27:09.334 13:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:09.594 13:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:09.594 13:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:09.594 13:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:09.594 13:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:09.594 [2024-07-25 13:35:50.385609] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:09.856 [2024-07-25 13:35:50.387176] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8118b0 00:27:09.856 [2024-07-25 13:35:50.388303] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:09.856 13:35:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@678 -- # sleep 1 00:27:10.797 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:10.797 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:10.797 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:10.797 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:10.797 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:10.797 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.797 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:11.062 "name": "raid_bdev1", 00:27:11.062 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:11.062 "strip_size_kb": 0, 00:27:11.062 "state": "online", 00:27:11.062 "raid_level": "raid1", 00:27:11.062 "superblock": true, 00:27:11.062 "num_base_bdevs": 2, 
00:27:11.062 "num_base_bdevs_discovered": 2, 00:27:11.062 "num_base_bdevs_operational": 2, 00:27:11.062 "process": { 00:27:11.062 "type": "rebuild", 00:27:11.062 "target": "spare", 00:27:11.062 "progress": { 00:27:11.062 "blocks": 2816, 00:27:11.062 "percent": 35 00:27:11.062 } 00:27:11.062 }, 00:27:11.062 "base_bdevs_list": [ 00:27:11.062 { 00:27:11.062 "name": "spare", 00:27:11.062 "uuid": "126b8242-5250-507a-9dda-39e148801b52", 00:27:11.062 "is_configured": true, 00:27:11.062 "data_offset": 256, 00:27:11.062 "data_size": 7936 00:27:11.062 }, 00:27:11.062 { 00:27:11.062 "name": "BaseBdev2", 00:27:11.062 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:11.062 "is_configured": true, 00:27:11.062 "data_offset": 256, 00:27:11.062 "data_size": 7936 00:27:11.062 } 00:27:11.062 ] 00:27:11.062 }' 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:27:11.062 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 
-- # '[' 2 -gt 2 ']' 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # local timeout=1008 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:11.062 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:11.322 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:11.322 "name": "raid_bdev1", 00:27:11.322 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:11.322 "strip_size_kb": 0, 00:27:11.322 "state": "online", 00:27:11.322 "raid_level": "raid1", 00:27:11.322 "superblock": true, 00:27:11.322 "num_base_bdevs": 2, 00:27:11.322 "num_base_bdevs_discovered": 2, 00:27:11.322 "num_base_bdevs_operational": 2, 00:27:11.322 "process": { 00:27:11.322 "type": "rebuild", 00:27:11.322 "target": "spare", 00:27:11.322 "progress": { 00:27:11.322 "blocks": 3584, 00:27:11.322 "percent": 45 00:27:11.322 } 00:27:11.322 }, 00:27:11.322 "base_bdevs_list": [ 00:27:11.322 { 00:27:11.322 "name": "spare", 00:27:11.322 "uuid": 
"126b8242-5250-507a-9dda-39e148801b52", 00:27:11.322 "is_configured": true, 00:27:11.322 "data_offset": 256, 00:27:11.322 "data_size": 7936 00:27:11.322 }, 00:27:11.322 { 00:27:11.322 "name": "BaseBdev2", 00:27:11.322 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:11.322 "is_configured": true, 00:27:11.322 "data_offset": 256, 00:27:11.322 "data_size": 7936 00:27:11.322 } 00:27:11.322 ] 00:27:11.322 }' 00:27:11.322 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:11.322 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:11.322 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:11.322 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:11.322 13:35:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@726 -- # sleep 1 00:27:12.261 13:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:12.261 13:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:12.261 13:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:12.261 13:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:12.261 13:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:12.261 13:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:12.261 13:35:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:12.261 13:35:52 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:12.521 13:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:12.521 "name": "raid_bdev1", 00:27:12.521 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:12.521 "strip_size_kb": 0, 00:27:12.521 "state": "online", 00:27:12.521 "raid_level": "raid1", 00:27:12.521 "superblock": true, 00:27:12.521 "num_base_bdevs": 2, 00:27:12.521 "num_base_bdevs_discovered": 2, 00:27:12.521 "num_base_bdevs_operational": 2, 00:27:12.521 "process": { 00:27:12.521 "type": "rebuild", 00:27:12.521 "target": "spare", 00:27:12.521 "progress": { 00:27:12.521 "blocks": 6912, 00:27:12.521 "percent": 87 00:27:12.521 } 00:27:12.521 }, 00:27:12.521 "base_bdevs_list": [ 00:27:12.521 { 00:27:12.521 "name": "spare", 00:27:12.521 "uuid": "126b8242-5250-507a-9dda-39e148801b52", 00:27:12.521 "is_configured": true, 00:27:12.521 "data_offset": 256, 00:27:12.521 "data_size": 7936 00:27:12.521 }, 00:27:12.521 { 00:27:12.521 "name": "BaseBdev2", 00:27:12.521 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:12.521 "is_configured": true, 00:27:12.521 "data_offset": 256, 00:27:12.521 "data_size": 7936 00:27:12.521 } 00:27:12.521 ] 00:27:12.521 }' 00:27:12.521 13:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:12.521 13:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:12.521 13:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:12.521 13:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:12.521 13:35:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@726 -- # sleep 1 00:27:12.781 [2024-07-25 13:35:53.506631] bdev_raid.c:2886:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:27:12.781 [2024-07-25 13:35:53.506673] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:12.781 [2024-07-25 13:35:53.506737] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:13.722 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:27:13.722 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:13.722 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:13.722 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:13.722 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:13.722 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:13.722 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.722 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.722 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:13.722 "name": "raid_bdev1", 00:27:13.722 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:13.722 "strip_size_kb": 0, 00:27:13.722 "state": "online", 00:27:13.722 "raid_level": "raid1", 00:27:13.722 "superblock": true, 00:27:13.722 "num_base_bdevs": 2, 00:27:13.722 "num_base_bdevs_discovered": 2, 00:27:13.722 "num_base_bdevs_operational": 2, 00:27:13.722 "base_bdevs_list": [ 00:27:13.722 { 00:27:13.722 "name": "spare", 00:27:13.722 "uuid": "126b8242-5250-507a-9dda-39e148801b52", 
00:27:13.723 "is_configured": true, 00:27:13.723 "data_offset": 256, 00:27:13.723 "data_size": 7936 00:27:13.723 }, 00:27:13.723 { 00:27:13.723 "name": "BaseBdev2", 00:27:13.723 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:13.723 "is_configured": true, 00:27:13.723 "data_offset": 256, 00:27:13.723 "data_size": 7936 00:27:13.723 } 00:27:13.723 ] 00:27:13.723 }' 00:27:13.723 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@724 -- # break 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:13.981 13:35:54 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:13.981 "name": "raid_bdev1", 00:27:13.981 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:13.981 "strip_size_kb": 0, 00:27:13.981 "state": "online", 00:27:13.981 "raid_level": "raid1", 00:27:13.981 "superblock": true, 00:27:13.981 "num_base_bdevs": 2, 00:27:13.981 "num_base_bdevs_discovered": 2, 00:27:13.981 "num_base_bdevs_operational": 2, 00:27:13.981 "base_bdevs_list": [ 00:27:13.981 { 00:27:13.981 "name": "spare", 00:27:13.981 "uuid": "126b8242-5250-507a-9dda-39e148801b52", 00:27:13.981 "is_configured": true, 00:27:13.981 "data_offset": 256, 00:27:13.981 "data_size": 7936 00:27:13.981 }, 00:27:13.981 { 00:27:13.981 "name": "BaseBdev2", 00:27:13.981 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:13.981 "is_configured": true, 00:27:13.981 "data_offset": 256, 00:27:13.981 "data_size": 7936 00:27:13.981 } 00:27:13.981 ] 00:27:13.981 }' 00:27:13.981 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:14.243 13:35:54 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:14.243 13:35:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:14.243 13:35:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:14.243 "name": "raid_bdev1", 00:27:14.243 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:14.243 "strip_size_kb": 0, 00:27:14.243 "state": "online", 00:27:14.243 "raid_level": "raid1", 00:27:14.243 "superblock": true, 00:27:14.243 "num_base_bdevs": 2, 00:27:14.243 "num_base_bdevs_discovered": 2, 00:27:14.243 "num_base_bdevs_operational": 2, 00:27:14.243 "base_bdevs_list": [ 00:27:14.243 { 00:27:14.243 "name": "spare", 00:27:14.243 "uuid": "126b8242-5250-507a-9dda-39e148801b52", 00:27:14.243 "is_configured": true, 00:27:14.243 "data_offset": 256, 00:27:14.243 "data_size": 7936 00:27:14.243 }, 00:27:14.243 { 00:27:14.243 "name": "BaseBdev2", 00:27:14.243 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:14.243 "is_configured": true, 00:27:14.243 "data_offset": 256, 00:27:14.243 "data_size": 7936 00:27:14.243 } 00:27:14.243 ] 
00:27:14.243 }' 00:27:14.243 13:35:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:14.243 13:35:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:14.891 13:35:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:15.460 [2024-07-25 13:35:56.055034] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:15.460 [2024-07-25 13:35:56.055052] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:15.460 [2024-07-25 13:35:56.055089] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:15.460 [2024-07-25 13:35:56.055128] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:15.460 [2024-07-25 13:35:56.055134] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x80ea10 name raid_bdev1, state offline 00:27:15.460 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.460 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # jq length 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # '[' true = true ']' 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # '[' false = true ']' 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:27:15.720 13:35:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:15.720 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:27:16.289 /dev/nbd0 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:27:16.289 13:35:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:16.289 1+0 records in 00:27:16.289 1+0 records out 00:27:16.289 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272591 s, 15.0 MB/s 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:16.289 13:35:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:27:16.289 /dev/nbd1 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:27:16.289 13:35:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@869 -- # local i 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@873 -- # break 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:27:16.289 1+0 records in 00:27:16.289 1+0 records out 00:27:16.289 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000251392 s, 16.3 MB/s 00:27:16.289 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # size=4096 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:27:16.549 13:35:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@889 -- # return 0 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:27:16.549 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:16.809 13:35:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:27:16.809 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:17.069 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:17.329 [2024-07-25 13:35:57.923221] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:17.329 [2024-07-25 13:35:57.923250] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:17.329 [2024-07-25 13:35:57.923262] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x811d50 00:27:17.329 [2024-07-25 13:35:57.923269] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:17.329 [2024-07-25 13:35:57.924439] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:17.329 [2024-07-25 13:35:57.924461] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:17.329 [2024-07-25 13:35:57.924505] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:17.329 [2024-07-25 13:35:57.924524] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:17.329 [2024-07-25 13:35:57.924607] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:17.329 spare 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.329 13:35:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:17.329 [2024-07-25 13:35:58.024891] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x80f590 00:27:17.329 [2024-07-25 13:35:58.024901] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:27:17.329 [2024-07-25 13:35:58.024950] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8118b0 00:27:17.329 [2024-07-25 13:35:58.025044] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x80f590 00:27:17.329 [2024-07-25 13:35:58.025049] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x80f590 00:27:17.329 [2024-07-25 13:35:58.025103] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:17.588 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:17.588 "name": "raid_bdev1", 00:27:17.588 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:17.588 "strip_size_kb": 0, 00:27:17.588 "state": "online", 00:27:17.588 "raid_level": "raid1", 00:27:17.588 "superblock": true, 00:27:17.588 "num_base_bdevs": 2, 00:27:17.588 
"num_base_bdevs_discovered": 2, 00:27:17.588 "num_base_bdevs_operational": 2, 00:27:17.588 "base_bdevs_list": [ 00:27:17.588 { 00:27:17.588 "name": "spare", 00:27:17.588 "uuid": "126b8242-5250-507a-9dda-39e148801b52", 00:27:17.588 "is_configured": true, 00:27:17.588 "data_offset": 256, 00:27:17.588 "data_size": 7936 00:27:17.588 }, 00:27:17.588 { 00:27:17.588 "name": "BaseBdev2", 00:27:17.588 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:17.588 "is_configured": true, 00:27:17.588 "data_offset": 256, 00:27:17.588 "data_size": 7936 00:27:17.588 } 00:27:17.588 ] 00:27:17.588 }' 00:27:17.588 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:17.588 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:18.158 "name": "raid_bdev1", 00:27:18.158 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:18.158 
"strip_size_kb": 0, 00:27:18.158 "state": "online", 00:27:18.158 "raid_level": "raid1", 00:27:18.158 "superblock": true, 00:27:18.158 "num_base_bdevs": 2, 00:27:18.158 "num_base_bdevs_discovered": 2, 00:27:18.158 "num_base_bdevs_operational": 2, 00:27:18.158 "base_bdevs_list": [ 00:27:18.158 { 00:27:18.158 "name": "spare", 00:27:18.158 "uuid": "126b8242-5250-507a-9dda-39e148801b52", 00:27:18.158 "is_configured": true, 00:27:18.158 "data_offset": 256, 00:27:18.158 "data_size": 7936 00:27:18.158 }, 00:27:18.158 { 00:27:18.158 "name": "BaseBdev2", 00:27:18.158 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:18.158 "is_configured": true, 00:27:18.158 "data_offset": 256, 00:27:18.158 "data_size": 7936 00:27:18.158 } 00:27:18.158 ] 00:27:18.158 }' 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:18.158 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:18.417 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:18.417 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.417 13:35:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:18.417 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:27:18.417 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:18.677 [2024-07-25 13:35:59.350901] 
bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:18.677 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:18.936 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:18.936 "name": "raid_bdev1", 00:27:18.936 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:18.936 "strip_size_kb": 0, 00:27:18.936 "state": "online", 00:27:18.936 "raid_level": "raid1", 00:27:18.936 "superblock": true, 00:27:18.936 
"num_base_bdevs": 2, 00:27:18.936 "num_base_bdevs_discovered": 1, 00:27:18.936 "num_base_bdevs_operational": 1, 00:27:18.936 "base_bdevs_list": [ 00:27:18.936 { 00:27:18.936 "name": null, 00:27:18.936 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:18.936 "is_configured": false, 00:27:18.936 "data_offset": 256, 00:27:18.936 "data_size": 7936 00:27:18.936 }, 00:27:18.936 { 00:27:18.936 "name": "BaseBdev2", 00:27:18.936 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:18.936 "is_configured": true, 00:27:18.936 "data_offset": 256, 00:27:18.936 "data_size": 7936 00:27:18.936 } 00:27:18.936 ] 00:27:18.936 }' 00:27:18.936 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:18.936 13:35:59 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:19.504 13:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:19.504 [2024-07-25 13:36:00.229142] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:19.504 [2024-07-25 13:36:00.229257] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:19.504 [2024-07-25 13:36:00.229267] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:19.504 [2024-07-25 13:36:00.229285] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:19.504 [2024-07-25 13:36:00.230773] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x80b620 00:27:19.504 [2024-07-25 13:36:00.231882] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:19.504 13:36:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # sleep 1 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:20.895 "name": "raid_bdev1", 00:27:20.895 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:20.895 "strip_size_kb": 0, 00:27:20.895 "state": "online", 00:27:20.895 "raid_level": "raid1", 00:27:20.895 "superblock": true, 00:27:20.895 "num_base_bdevs": 2, 00:27:20.895 "num_base_bdevs_discovered": 2, 00:27:20.895 "num_base_bdevs_operational": 2, 00:27:20.895 "process": { 00:27:20.895 "type": "rebuild", 00:27:20.895 
"target": "spare", 00:27:20.895 "progress": { 00:27:20.895 "blocks": 2816, 00:27:20.895 "percent": 35 00:27:20.895 } 00:27:20.895 }, 00:27:20.895 "base_bdevs_list": [ 00:27:20.895 { 00:27:20.895 "name": "spare", 00:27:20.895 "uuid": "126b8242-5250-507a-9dda-39e148801b52", 00:27:20.895 "is_configured": true, 00:27:20.895 "data_offset": 256, 00:27:20.895 "data_size": 7936 00:27:20.895 }, 00:27:20.895 { 00:27:20.895 "name": "BaseBdev2", 00:27:20.895 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:20.895 "is_configured": true, 00:27:20.895 "data_offset": 256, 00:27:20.895 "data_size": 7936 00:27:20.895 } 00:27:20.895 ] 00:27:20.895 }' 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:20.895 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:21.155 [2024-07-25 13:36:01.713395] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:21.155 [2024-07-25 13:36:01.740743] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:21.155 [2024-07-25 13:36:01.740775] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:21.155 [2024-07-25 13:36:01.740784] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:21.155 [2024-07-25 13:36:01.740789] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:21.155 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:21.414 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:21.414 "name": "raid_bdev1", 00:27:21.414 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:21.414 "strip_size_kb": 0, 00:27:21.414 "state": "online", 00:27:21.414 "raid_level": "raid1", 00:27:21.414 "superblock": true, 00:27:21.414 "num_base_bdevs": 2, 00:27:21.414 "num_base_bdevs_discovered": 1, 
00:27:21.414 "num_base_bdevs_operational": 1, 00:27:21.414 "base_bdevs_list": [ 00:27:21.414 { 00:27:21.414 "name": null, 00:27:21.414 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:21.414 "is_configured": false, 00:27:21.414 "data_offset": 256, 00:27:21.414 "data_size": 7936 00:27:21.414 }, 00:27:21.414 { 00:27:21.414 "name": "BaseBdev2", 00:27:21.414 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:21.414 "is_configured": true, 00:27:21.414 "data_offset": 256, 00:27:21.414 "data_size": 7936 00:27:21.414 } 00:27:21.414 ] 00:27:21.414 }' 00:27:21.414 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:21.414 13:36:01 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:21.983 13:36:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:21.983 [2024-07-25 13:36:02.661025] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:21.983 [2024-07-25 13:36:02.661057] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:21.983 [2024-07-25 13:36:02.661072] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x9a28f0 00:27:21.983 [2024-07-25 13:36:02.661080] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:21.983 [2024-07-25 13:36:02.661255] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:21.983 [2024-07-25 13:36:02.661265] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:21.983 [2024-07-25 13:36:02.661306] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:21.983 [2024-07-25 13:36:02.661313] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller 
than existing raid bdev raid_bdev1 (5) 00:27:21.983 [2024-07-25 13:36:02.661319] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:21.983 [2024-07-25 13:36:02.661330] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:21.983 [2024-07-25 13:36:02.662832] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x80b620 00:27:21.983 [2024-07-25 13:36:02.663908] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:21.983 spare 00:27:21.983 13:36:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # sleep 1 00:27:22.922 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:22.922 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:22.922 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:22.922 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:22.922 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:22.922 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:22.922 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.181 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:23.181 "name": "raid_bdev1", 00:27:23.181 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:23.181 "strip_size_kb": 0, 00:27:23.181 "state": "online", 00:27:23.181 "raid_level": "raid1", 00:27:23.181 "superblock": true, 
00:27:23.181 "num_base_bdevs": 2, 00:27:23.181 "num_base_bdevs_discovered": 2, 00:27:23.181 "num_base_bdevs_operational": 2, 00:27:23.181 "process": { 00:27:23.181 "type": "rebuild", 00:27:23.181 "target": "spare", 00:27:23.181 "progress": { 00:27:23.181 "blocks": 2816, 00:27:23.181 "percent": 35 00:27:23.181 } 00:27:23.181 }, 00:27:23.181 "base_bdevs_list": [ 00:27:23.181 { 00:27:23.181 "name": "spare", 00:27:23.181 "uuid": "126b8242-5250-507a-9dda-39e148801b52", 00:27:23.181 "is_configured": true, 00:27:23.181 "data_offset": 256, 00:27:23.181 "data_size": 7936 00:27:23.181 }, 00:27:23.181 { 00:27:23.181 "name": "BaseBdev2", 00:27:23.181 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:23.181 "is_configured": true, 00:27:23.181 "data_offset": 256, 00:27:23.181 "data_size": 7936 00:27:23.181 } 00:27:23.181 ] 00:27:23.181 }' 00:27:23.181 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:23.181 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:23.181 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:23.441 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:23.441 13:36:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:23.441 [2024-07-25 13:36:04.158658] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:23.441 [2024-07-25 13:36:04.172797] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:23.441 [2024-07-25 13:36:04.172824] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:23.441 [2024-07-25 13:36:04.172833] 
bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:23.441 [2024-07-25 13:36:04.172838] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.441 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:23.702 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:23.702 "name": "raid_bdev1", 00:27:23.702 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 
00:27:23.702 "strip_size_kb": 0, 00:27:23.702 "state": "online", 00:27:23.702 "raid_level": "raid1", 00:27:23.702 "superblock": true, 00:27:23.702 "num_base_bdevs": 2, 00:27:23.702 "num_base_bdevs_discovered": 1, 00:27:23.702 "num_base_bdevs_operational": 1, 00:27:23.702 "base_bdevs_list": [ 00:27:23.702 { 00:27:23.702 "name": null, 00:27:23.702 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:23.702 "is_configured": false, 00:27:23.702 "data_offset": 256, 00:27:23.702 "data_size": 7936 00:27:23.702 }, 00:27:23.702 { 00:27:23.702 "name": "BaseBdev2", 00:27:23.702 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:23.702 "is_configured": true, 00:27:23.702 "data_offset": 256, 00:27:23.702 "data_size": 7936 00:27:23.702 } 00:27:23.702 ] 00:27:23.702 }' 00:27:23.702 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:23.702 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:24.272 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:24.272 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:24.272 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:24.272 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:24.272 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:24.272 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.272 13:36:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:24.532 13:36:05 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:24.532 "name": "raid_bdev1", 00:27:24.532 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:24.532 "strip_size_kb": 0, 00:27:24.532 "state": "online", 00:27:24.532 "raid_level": "raid1", 00:27:24.532 "superblock": true, 00:27:24.532 "num_base_bdevs": 2, 00:27:24.532 "num_base_bdevs_discovered": 1, 00:27:24.532 "num_base_bdevs_operational": 1, 00:27:24.532 "base_bdevs_list": [ 00:27:24.532 { 00:27:24.532 "name": null, 00:27:24.532 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:24.532 "is_configured": false, 00:27:24.532 "data_offset": 256, 00:27:24.532 "data_size": 7936 00:27:24.532 }, 00:27:24.532 { 00:27:24.532 "name": "BaseBdev2", 00:27:24.532 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:24.532 "is_configured": true, 00:27:24.532 "data_offset": 256, 00:27:24.532 "data_size": 7936 00:27:24.532 } 00:27:24.532 ] 00:27:24.532 }' 00:27:24.532 13:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:24.532 13:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:24.532 13:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:24.532 13:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:24.532 13:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:24.792 13:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:25.052 [2024-07-25 13:36:05.598306] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:27:25.052 [2024-07-25 13:36:05.598336] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:25.052 [2024-07-25 13:36:05.598348] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x813050 00:27:25.052 [2024-07-25 13:36:05.598355] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:25.052 [2024-07-25 13:36:05.598505] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:25.052 [2024-07-25 13:36:05.598515] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:25.052 [2024-07-25 13:36:05.598557] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:25.052 [2024-07-25 13:36:05.598564] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:25.052 [2024-07-25 13:36:05.598570] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:25.052 BaseBdev1 00:27:25.052 13:36:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@789 -- # sleep 1 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:25.990 
13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:25.990 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.249 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.249 "name": "raid_bdev1", 00:27:26.249 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:26.249 "strip_size_kb": 0, 00:27:26.249 "state": "online", 00:27:26.249 "raid_level": "raid1", 00:27:26.249 "superblock": true, 00:27:26.249 "num_base_bdevs": 2, 00:27:26.249 "num_base_bdevs_discovered": 1, 00:27:26.249 "num_base_bdevs_operational": 1, 00:27:26.249 "base_bdevs_list": [ 00:27:26.249 { 00:27:26.249 "name": null, 00:27:26.249 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.249 "is_configured": false, 00:27:26.249 "data_offset": 256, 00:27:26.249 "data_size": 7936 00:27:26.249 }, 00:27:26.249 { 00:27:26.250 "name": "BaseBdev2", 00:27:26.250 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:26.250 "is_configured": true, 00:27:26.250 "data_offset": 256, 00:27:26.250 "data_size": 7936 00:27:26.250 } 00:27:26.250 ] 00:27:26.250 }' 00:27:26.250 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.250 13:36:06 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:27:26.820 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:26.820 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:26.820 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:26.820 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:26.820 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:26.820 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.820 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.820 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:26.820 "name": "raid_bdev1", 00:27:26.820 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:26.820 "strip_size_kb": 0, 00:27:26.820 "state": "online", 00:27:26.820 "raid_level": "raid1", 00:27:26.820 "superblock": true, 00:27:26.820 "num_base_bdevs": 2, 00:27:26.820 "num_base_bdevs_discovered": 1, 00:27:26.820 "num_base_bdevs_operational": 1, 00:27:26.820 "base_bdevs_list": [ 00:27:26.820 { 00:27:26.820 "name": null, 00:27:26.820 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:26.820 "is_configured": false, 00:27:26.820 "data_offset": 256, 00:27:26.820 "data_size": 7936 00:27:26.820 }, 00:27:26.820 { 00:27:26.820 "name": "BaseBdev2", 00:27:26.820 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:26.820 "is_configured": true, 00:27:26.820 "data_offset": 256, 00:27:26.820 "data_size": 7936 00:27:26.820 } 00:27:26.820 ] 00:27:26.820 }' 00:27:26.820 13:36:07 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@650 -- # local es=0 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 
00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:27.080 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:27.080 [2024-07-25 13:36:07.860038] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:27.080 [2024-07-25 13:36:07.860129] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:27.080 [2024-07-25 13:36:07.860137] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:27.080 request: 00:27:27.080 { 00:27:27.080 "base_bdev": "BaseBdev1", 00:27:27.080 "raid_bdev": "raid_bdev1", 00:27:27.080 "method": "bdev_raid_add_base_bdev", 00:27:27.080 "req_id": 1 00:27:27.080 } 00:27:27.080 Got JSON-RPC error response 00:27:27.080 response: 00:27:27.080 { 00:27:27.080 "code": -22, 00:27:27.080 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:27.080 } 00:27:27.340 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@653 -- # es=1 00:27:27.340 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:27:27.340 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:27:27.340 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:27:27.340 13:36:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@793 -- # 
sleep 1 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:28.281 13:36:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:28.542 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:28.542 "name": "raid_bdev1", 00:27:28.542 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:28.542 "strip_size_kb": 0, 00:27:28.542 "state": "online", 00:27:28.542 "raid_level": "raid1", 00:27:28.542 "superblock": true, 00:27:28.542 "num_base_bdevs": 2, 00:27:28.542 "num_base_bdevs_discovered": 1, 
00:27:28.542 "num_base_bdevs_operational": 1, 00:27:28.542 "base_bdevs_list": [ 00:27:28.542 { 00:27:28.542 "name": null, 00:27:28.542 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:28.542 "is_configured": false, 00:27:28.542 "data_offset": 256, 00:27:28.542 "data_size": 7936 00:27:28.542 }, 00:27:28.542 { 00:27:28.542 "name": "BaseBdev2", 00:27:28.542 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:28.542 "is_configured": true, 00:27:28.542 "data_offset": 256, 00:27:28.542 "data_size": 7936 00:27:28.542 } 00:27:28.542 ] 00:27:28.542 }' 00:27:28.542 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:28.542 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:29.112 "name": "raid_bdev1", 00:27:29.112 "uuid": "c973c78f-62b3-4a8b-a7ff-0cf3030cda20", 00:27:29.112 "strip_size_kb": 0, 00:27:29.112 
"state": "online", 00:27:29.112 "raid_level": "raid1", 00:27:29.112 "superblock": true, 00:27:29.112 "num_base_bdevs": 2, 00:27:29.112 "num_base_bdevs_discovered": 1, 00:27:29.112 "num_base_bdevs_operational": 1, 00:27:29.112 "base_bdevs_list": [ 00:27:29.112 { 00:27:29.112 "name": null, 00:27:29.112 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.112 "is_configured": false, 00:27:29.112 "data_offset": 256, 00:27:29.112 "data_size": 7936 00:27:29.112 }, 00:27:29.112 { 00:27:29.112 "name": "BaseBdev2", 00:27:29.112 "uuid": "9741f160-c66b-5504-b991-99e9ab3afb4d", 00:27:29.112 "is_configured": true, 00:27:29.112 "data_offset": 256, 00:27:29.112 "data_size": 7936 00:27:29.112 } 00:27:29.112 ] 00:27:29.112 }' 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:29.112 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@798 -- # killprocess 1048297 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@950 -- # '[' -z 1048297 ']' 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # kill -0 1048297 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # uname 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1048297 00:27:29.372 13:36:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1048297' 00:27:29.372 killing process with pid 1048297 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@969 -- # kill 1048297 00:27:29.372 Received shutdown signal, test time was about 60.000000 seconds 00:27:29.372 00:27:29.372 Latency(us) 00:27:29.372 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:29.372 =================================================================================================================== 00:27:29.372 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:29.372 [2024-07-25 13:36:09.967200] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:29.372 [2024-07-25 13:36:09.967261] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:29.372 [2024-07-25 13:36:09.967289] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:29.372 [2024-07-25 13:36:09.967296] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x80f590 name raid_bdev1, state offline 00:27:29.372 13:36:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@974 -- # wait 1048297 00:27:29.372 [2024-07-25 13:36:09.985994] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:29.372 13:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@800 -- # return 0 00:27:29.372 00:27:29.372 real 0m29.946s 00:27:29.372 user 0m47.730s 00:27:29.372 sys 0m3.792s 00:27:29.372 13:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:27:29.372 13:36:10 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:29.372 ************************************ 00:27:29.372 END TEST raid_rebuild_test_sb_md_separate 00:27:29.372 ************************************ 00:27:29.372 13:36:10 bdev_raid -- bdev/bdev_raid.sh@991 -- # base_malloc_params='-m 32 -i' 00:27:29.372 13:36:10 bdev_raid -- bdev/bdev_raid.sh@992 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:27:29.372 13:36:10 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:27:29.372 13:36:10 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:29.372 13:36:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:29.633 ************************************ 00:27:29.633 START TEST raid_state_function_test_sb_md_interleaved 00:27:29.633 ************************************ 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # raid_state_function_test raid1 2 true 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:29.633 13:36:10 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # 
raid_pid=1053993 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 1053993' 00:27:29.633 Process raid pid: 1053993 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 1053993 /var/tmp/spdk-raid.sock 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1053993 ']' 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:29.633 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:29.633 13:36:10 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:29.633 [2024-07-25 13:36:10.245792] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
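The `waitforlisten` step logged above polls until the `bdev_svc` app's RPC socket answers. A minimal sketch of that polling loop, simulated with a plain file in place of a real UNIX socket — the path, retry count, and function body are illustrative, not SPDK's actual `autotest_common.sh` helper, which additionally probes the socket with an `rpc.py` call:

```shell
# Poll until a path appears, giving up after max_retries attempts.
# Stands in for the waitforlisten helper seen in this log.
waitforlisten() {
    sock="$1"
    max_retries="${2:-100}"
    i=0
    while [ ! -e "$sock" ]; do
        i=$((i + 1))
        if [ "$i" -ge "$max_retries" ]; then
            return 1
        fi
        sleep 0.1
    done
    return 0
}

rm -f /tmp/demo-raid.sock
( sleep 0.3; touch /tmp/demo-raid.sock ) &   # simulated app startup
waitforlisten /tmp/demo-raid.sock && echo "listening"
wait
```

In the real harness the loop ends when an RPC round-trip on `/var/tmp/spdk-raid.sock` succeeds rather than when the path merely exists.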
00:27:29.633 [2024-07-25 13:36:10.245840] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:29.633 [2024-07-25 13:36:10.334616] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:29.633 [2024-07-25 13:36:10.398882] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:29.894 [2024-07-25 13:36:10.451861] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:29.894 [2024-07-25 13:36:10.451887] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:30.464 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:30.464 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:27:30.464 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:30.724 [2024-07-25 13:36:11.259718] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:30.724 [2024-07-25 13:36:11.259749] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:30.724 [2024-07-25 13:36:11.259755] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:30.724 [2024-07-25 13:36:11.259761] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:30.724 "name": "Existed_Raid", 00:27:30.724 "uuid": "4345fd54-684a-4989-baaf-79e83b66842d", 00:27:30.724 "strip_size_kb": 0, 00:27:30.724 "state": "configuring", 00:27:30.724 "raid_level": "raid1", 00:27:30.724 "superblock": true, 00:27:30.724 "num_base_bdevs": 2, 00:27:30.724 "num_base_bdevs_discovered": 0, 00:27:30.724 "num_base_bdevs_operational": 2, 00:27:30.724 "base_bdevs_list": [ 00:27:30.724 { 
00:27:30.724 "name": "BaseBdev1", 00:27:30.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:30.724 "is_configured": false, 00:27:30.724 "data_offset": 0, 00:27:30.724 "data_size": 0 00:27:30.724 }, 00:27:30.724 { 00:27:30.724 "name": "BaseBdev2", 00:27:30.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:30.724 "is_configured": false, 00:27:30.724 "data_offset": 0, 00:27:30.724 "data_size": 0 00:27:30.724 } 00:27:30.724 ] 00:27:30.724 }' 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:30.724 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:31.294 13:36:11 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:31.554 [2024-07-25 13:36:12.165899] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:31.554 [2024-07-25 13:36:12.165920] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22066b0 name Existed_Raid, state configuring 00:27:31.554 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:31.554 [2024-07-25 13:36:12.342360] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:31.554 [2024-07-25 13:36:12.342379] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:31.554 [2024-07-25 13:36:12.342384] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:31.554 [2024-07-25 13:36:12.342389] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:31.814 
13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:27:31.814 [2024-07-25 13:36:12.525443] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:31.814 BaseBdev1 00:27:31.814 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:31.814 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev1 00:27:31.814 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:27:31.814 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # local i 00:27:31.814 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:27:31.814 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:27:31.814 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:32.074 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:32.334 [ 00:27:32.334 { 00:27:32.334 "name": "BaseBdev1", 00:27:32.334 "aliases": [ 00:27:32.334 "67d4f9f6-cd33-476c-b5d6-eef2af973339" 00:27:32.334 ], 00:27:32.334 "product_name": "Malloc disk", 00:27:32.334 "block_size": 4128, 00:27:32.334 "num_blocks": 8192, 00:27:32.334 "uuid": "67d4f9f6-cd33-476c-b5d6-eef2af973339", 00:27:32.334 "md_size": 32, 00:27:32.334 
"md_interleave": true, 00:27:32.334 "dif_type": 0, 00:27:32.334 "assigned_rate_limits": { 00:27:32.334 "rw_ios_per_sec": 0, 00:27:32.334 "rw_mbytes_per_sec": 0, 00:27:32.334 "r_mbytes_per_sec": 0, 00:27:32.334 "w_mbytes_per_sec": 0 00:27:32.334 }, 00:27:32.334 "claimed": true, 00:27:32.334 "claim_type": "exclusive_write", 00:27:32.334 "zoned": false, 00:27:32.334 "supported_io_types": { 00:27:32.334 "read": true, 00:27:32.334 "write": true, 00:27:32.334 "unmap": true, 00:27:32.334 "flush": true, 00:27:32.334 "reset": true, 00:27:32.334 "nvme_admin": false, 00:27:32.334 "nvme_io": false, 00:27:32.334 "nvme_io_md": false, 00:27:32.334 "write_zeroes": true, 00:27:32.334 "zcopy": true, 00:27:32.334 "get_zone_info": false, 00:27:32.334 "zone_management": false, 00:27:32.334 "zone_append": false, 00:27:32.334 "compare": false, 00:27:32.334 "compare_and_write": false, 00:27:32.334 "abort": true, 00:27:32.334 "seek_hole": false, 00:27:32.334 "seek_data": false, 00:27:32.334 "copy": true, 00:27:32.334 "nvme_iov_md": false 00:27:32.334 }, 00:27:32.334 "memory_domains": [ 00:27:32.334 { 00:27:32.334 "dma_device_id": "system", 00:27:32.334 "dma_device_type": 1 00:27:32.334 }, 00:27:32.334 { 00:27:32.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:32.334 "dma_device_type": 2 00:27:32.334 } 00:27:32.334 ], 00:27:32.334 "driver_specific": {} 00:27:32.334 } 00:27:32.334 ] 00:27:32.334 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@907 -- # return 0 00:27:32.334 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:32.334 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:32.335 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:32.335 13:36:12 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:32.335 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:32.335 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:32.335 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:32.335 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:32.335 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:32.335 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:32.335 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.335 13:36:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:32.335 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:32.335 "name": "Existed_Raid", 00:27:32.335 "uuid": "99e60d75-d1c0-4e1a-97b2-071f00e97e88", 00:27:32.335 "strip_size_kb": 0, 00:27:32.335 "state": "configuring", 00:27:32.335 "raid_level": "raid1", 00:27:32.335 "superblock": true, 00:27:32.335 "num_base_bdevs": 2, 00:27:32.335 "num_base_bdevs_discovered": 1, 00:27:32.335 "num_base_bdevs_operational": 2, 00:27:32.335 "base_bdevs_list": [ 00:27:32.335 { 00:27:32.335 "name": "BaseBdev1", 00:27:32.335 "uuid": "67d4f9f6-cd33-476c-b5d6-eef2af973339", 00:27:32.335 "is_configured": true, 00:27:32.335 "data_offset": 256, 00:27:32.335 "data_size": 7936 00:27:32.335 }, 
00:27:32.335 { 00:27:32.335 "name": "BaseBdev2", 00:27:32.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:32.335 "is_configured": false, 00:27:32.335 "data_offset": 0, 00:27:32.335 "data_size": 0 00:27:32.335 } 00:27:32.335 ] 00:27:32.335 }' 00:27:32.335 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:32.335 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:32.904 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:33.164 [2024-07-25 13:36:13.784646] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:33.164 [2024-07-25 13:36:13.784671] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2205fa0 name Existed_Raid, state configuring 00:27:33.164 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:33.164 [2024-07-25 13:36:13.937058] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:33.164 [2024-07-25 13:36:13.938191] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:33.164 [2024-07-25 13:36:13.938212] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # 
verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:33.424 13:36:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:33.424 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:33.424 "name": "Existed_Raid", 00:27:33.424 "uuid": "d7b5ba0b-550a-46db-92e2-1b857a73a874", 00:27:33.424 "strip_size_kb": 0, 00:27:33.424 "state": "configuring", 00:27:33.424 "raid_level": "raid1", 00:27:33.424 "superblock": true, 00:27:33.424 "num_base_bdevs": 2, 
00:27:33.424 "num_base_bdevs_discovered": 1, 00:27:33.424 "num_base_bdevs_operational": 2, 00:27:33.424 "base_bdevs_list": [ 00:27:33.424 { 00:27:33.424 "name": "BaseBdev1", 00:27:33.424 "uuid": "67d4f9f6-cd33-476c-b5d6-eef2af973339", 00:27:33.424 "is_configured": true, 00:27:33.424 "data_offset": 256, 00:27:33.424 "data_size": 7936 00:27:33.424 }, 00:27:33.424 { 00:27:33.424 "name": "BaseBdev2", 00:27:33.424 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:33.424 "is_configured": false, 00:27:33.424 "data_offset": 0, 00:27:33.424 "data_size": 0 00:27:33.424 } 00:27:33.424 ] 00:27:33.424 }' 00:27:33.424 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:33.424 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:33.994 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:27:34.253 [2024-07-25 13:36:14.900595] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:34.253 [2024-07-25 13:36:14.900694] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x22056e0 00:27:34.253 [2024-07-25 13:36:14.900701] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:34.253 [2024-07-25 13:36:14.900743] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23a3550 00:27:34.253 [2024-07-25 13:36:14.900801] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x22056e0 00:27:34.253 [2024-07-25 13:36:14.900807] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x22056e0 00:27:34.253 [2024-07-25 13:36:14.900848] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:34.253 BaseBdev2 
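The `verify_raid_bdev_state` checks that recur throughout this log boil down to one jq selection plus per-field comparisons. A pared-down sketch, assuming `jq` is available and using a reduced JSON document in place of live `bdev_raid_get_bdevs all` output (field names match the dumps above; the values are illustrative):

```shell
# Reduced stand-in for `rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all`.
rpc_output='[{"name":"Existed_Raid","state":"online","raid_level":"raid1","strip_size_kb":0,"num_base_bdevs_discovered":2,"num_base_bdevs_operational":2}]'

# Select the one raid bdev under test (the bdev_raid.sh@126 pattern).
raid_bdev_info=$(echo "$rpc_output" | jq -r '.[] | select(.name == "Existed_Raid")')

# Extract individual fields for comparison against the expected state.
state=$(echo "$raid_bdev_info" | jq -r '.state')
level=$(echo "$raid_bdev_info" | jq -r '.raid_level')

# The `// "none"` fallback (bdev_raid.sh@189) maps a missing rebuild
# process object to the literal string "none".
process_type=$(echo "$raid_bdev_info" | jq -r '.process.type // "none"')

[ "$state" = "online" ] && [ "$level" = "raid1" ] && [ "$process_type" = "none" ] && echo "state verified"
```

This is why the `[[ none == \n\o\n\e ]]` comparisons appear in the trace: with no rebuild in progress, both `.process.type` and `.process.target` resolve to the fallback `"none"`.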
00:27:34.253 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:34.254 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_name=BaseBdev2 00:27:34.254 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:27:34.254 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # local i 00:27:34.254 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:27:34.254 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:27:34.254 13:36:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:34.514 [ 00:27:34.514 { 00:27:34.514 "name": "BaseBdev2", 00:27:34.514 "aliases": [ 00:27:34.514 "7d6c69f5-d68b-45b0-8208-d9001df0a064" 00:27:34.514 ], 00:27:34.514 "product_name": "Malloc disk", 00:27:34.514 "block_size": 4128, 00:27:34.514 "num_blocks": 8192, 00:27:34.514 "uuid": "7d6c69f5-d68b-45b0-8208-d9001df0a064", 00:27:34.514 "md_size": 32, 00:27:34.514 "md_interleave": true, 00:27:34.514 "dif_type": 0, 00:27:34.514 "assigned_rate_limits": { 00:27:34.514 "rw_ios_per_sec": 0, 00:27:34.514 "rw_mbytes_per_sec": 0, 00:27:34.514 "r_mbytes_per_sec": 0, 00:27:34.514 "w_mbytes_per_sec": 0 00:27:34.514 }, 00:27:34.514 "claimed": true, 00:27:34.514 "claim_type": "exclusive_write", 00:27:34.514 "zoned": false, 00:27:34.514 "supported_io_types": { 
00:27:34.514 "read": true, 00:27:34.514 "write": true, 00:27:34.514 "unmap": true, 00:27:34.514 "flush": true, 00:27:34.514 "reset": true, 00:27:34.514 "nvme_admin": false, 00:27:34.514 "nvme_io": false, 00:27:34.514 "nvme_io_md": false, 00:27:34.514 "write_zeroes": true, 00:27:34.514 "zcopy": true, 00:27:34.514 "get_zone_info": false, 00:27:34.514 "zone_management": false, 00:27:34.514 "zone_append": false, 00:27:34.514 "compare": false, 00:27:34.514 "compare_and_write": false, 00:27:34.514 "abort": true, 00:27:34.514 "seek_hole": false, 00:27:34.514 "seek_data": false, 00:27:34.514 "copy": true, 00:27:34.514 "nvme_iov_md": false 00:27:34.514 }, 00:27:34.514 "memory_domains": [ 00:27:34.514 { 00:27:34.514 "dma_device_id": "system", 00:27:34.514 "dma_device_type": 1 00:27:34.514 }, 00:27:34.514 { 00:27:34.514 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:34.514 "dma_device_type": 2 00:27:34.514 } 00:27:34.514 ], 00:27:34.514 "driver_specific": {} 00:27:34.514 } 00:27:34.514 ] 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@907 -- # return 0 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:34.514 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:34.774 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:34.774 "name": "Existed_Raid", 00:27:34.774 "uuid": "d7b5ba0b-550a-46db-92e2-1b857a73a874", 00:27:34.774 "strip_size_kb": 0, 00:27:34.774 "state": "online", 00:27:34.774 "raid_level": "raid1", 00:27:34.774 "superblock": true, 00:27:34.774 "num_base_bdevs": 2, 00:27:34.774 "num_base_bdevs_discovered": 2, 00:27:34.774 "num_base_bdevs_operational": 2, 00:27:34.774 "base_bdevs_list": [ 00:27:34.774 { 00:27:34.774 "name": "BaseBdev1", 00:27:34.774 "uuid": "67d4f9f6-cd33-476c-b5d6-eef2af973339", 00:27:34.774 "is_configured": true, 00:27:34.774 "data_offset": 256, 00:27:34.774 "data_size": 7936 00:27:34.774 }, 00:27:34.774 { 00:27:34.774 "name": "BaseBdev2", 00:27:34.774 "uuid": "7d6c69f5-d68b-45b0-8208-d9001df0a064", 00:27:34.774 "is_configured": true, 00:27:34.774 "data_offset": 256, 00:27:34.774 
"data_size": 7936 00:27:34.774 } 00:27:34.774 ] 00:27:34.774 }' 00:27:34.774 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:34.774 13:36:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:35.343 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:35.343 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:35.343 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:35.343 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:35.343 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:35.343 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:27:35.343 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:35.343 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:35.603 [2024-07-25 13:36:16.204125] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:35.603 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:35.603 "name": "Existed_Raid", 00:27:35.603 "aliases": [ 00:27:35.603 "d7b5ba0b-550a-46db-92e2-1b857a73a874" 00:27:35.603 ], 00:27:35.603 "product_name": "Raid Volume", 00:27:35.603 "block_size": 4128, 00:27:35.603 "num_blocks": 7936, 00:27:35.603 "uuid": "d7b5ba0b-550a-46db-92e2-1b857a73a874", 00:27:35.603 "md_size": 32, 
00:27:35.603 "md_interleave": true, 00:27:35.603 "dif_type": 0, 00:27:35.603 "assigned_rate_limits": { 00:27:35.603 "rw_ios_per_sec": 0, 00:27:35.603 "rw_mbytes_per_sec": 0, 00:27:35.603 "r_mbytes_per_sec": 0, 00:27:35.603 "w_mbytes_per_sec": 0 00:27:35.603 }, 00:27:35.603 "claimed": false, 00:27:35.603 "zoned": false, 00:27:35.603 "supported_io_types": { 00:27:35.603 "read": true, 00:27:35.603 "write": true, 00:27:35.603 "unmap": false, 00:27:35.603 "flush": false, 00:27:35.603 "reset": true, 00:27:35.603 "nvme_admin": false, 00:27:35.603 "nvme_io": false, 00:27:35.603 "nvme_io_md": false, 00:27:35.603 "write_zeroes": true, 00:27:35.603 "zcopy": false, 00:27:35.603 "get_zone_info": false, 00:27:35.603 "zone_management": false, 00:27:35.603 "zone_append": false, 00:27:35.603 "compare": false, 00:27:35.603 "compare_and_write": false, 00:27:35.603 "abort": false, 00:27:35.603 "seek_hole": false, 00:27:35.603 "seek_data": false, 00:27:35.603 "copy": false, 00:27:35.603 "nvme_iov_md": false 00:27:35.603 }, 00:27:35.603 "memory_domains": [ 00:27:35.603 { 00:27:35.603 "dma_device_id": "system", 00:27:35.603 "dma_device_type": 1 00:27:35.603 }, 00:27:35.603 { 00:27:35.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.603 "dma_device_type": 2 00:27:35.603 }, 00:27:35.603 { 00:27:35.603 "dma_device_id": "system", 00:27:35.603 "dma_device_type": 1 00:27:35.603 }, 00:27:35.603 { 00:27:35.603 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.603 "dma_device_type": 2 00:27:35.603 } 00:27:35.603 ], 00:27:35.603 "driver_specific": { 00:27:35.603 "raid": { 00:27:35.603 "uuid": "d7b5ba0b-550a-46db-92e2-1b857a73a874", 00:27:35.603 "strip_size_kb": 0, 00:27:35.603 "state": "online", 00:27:35.603 "raid_level": "raid1", 00:27:35.603 "superblock": true, 00:27:35.603 "num_base_bdevs": 2, 00:27:35.603 "num_base_bdevs_discovered": 2, 00:27:35.603 "num_base_bdevs_operational": 2, 00:27:35.603 "base_bdevs_list": [ 00:27:35.603 { 00:27:35.603 "name": "BaseBdev1", 00:27:35.603 "uuid": 
"67d4f9f6-cd33-476c-b5d6-eef2af973339", 00:27:35.603 "is_configured": true, 00:27:35.603 "data_offset": 256, 00:27:35.603 "data_size": 7936 00:27:35.603 }, 00:27:35.603 { 00:27:35.603 "name": "BaseBdev2", 00:27:35.603 "uuid": "7d6c69f5-d68b-45b0-8208-d9001df0a064", 00:27:35.603 "is_configured": true, 00:27:35.603 "data_offset": 256, 00:27:35.603 "data_size": 7936 00:27:35.603 } 00:27:35.603 ] 00:27:35.603 } 00:27:35.603 } 00:27:35.603 }' 00:27:35.603 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:35.603 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:35.603 BaseBdev2' 00:27:35.603 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:35.603 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:35.603 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:35.863 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:35.863 "name": "BaseBdev1", 00:27:35.863 "aliases": [ 00:27:35.863 "67d4f9f6-cd33-476c-b5d6-eef2af973339" 00:27:35.863 ], 00:27:35.863 "product_name": "Malloc disk", 00:27:35.863 "block_size": 4128, 00:27:35.863 "num_blocks": 8192, 00:27:35.863 "uuid": "67d4f9f6-cd33-476c-b5d6-eef2af973339", 00:27:35.863 "md_size": 32, 00:27:35.863 "md_interleave": true, 00:27:35.863 "dif_type": 0, 00:27:35.863 "assigned_rate_limits": { 00:27:35.863 "rw_ios_per_sec": 0, 00:27:35.863 "rw_mbytes_per_sec": 0, 00:27:35.863 "r_mbytes_per_sec": 0, 00:27:35.863 "w_mbytes_per_sec": 0 00:27:35.863 }, 00:27:35.863 "claimed": 
true, 00:27:35.863 "claim_type": "exclusive_write", 00:27:35.863 "zoned": false, 00:27:35.863 "supported_io_types": { 00:27:35.863 "read": true, 00:27:35.863 "write": true, 00:27:35.863 "unmap": true, 00:27:35.863 "flush": true, 00:27:35.863 "reset": true, 00:27:35.863 "nvme_admin": false, 00:27:35.863 "nvme_io": false, 00:27:35.863 "nvme_io_md": false, 00:27:35.863 "write_zeroes": true, 00:27:35.863 "zcopy": true, 00:27:35.863 "get_zone_info": false, 00:27:35.863 "zone_management": false, 00:27:35.863 "zone_append": false, 00:27:35.863 "compare": false, 00:27:35.863 "compare_and_write": false, 00:27:35.863 "abort": true, 00:27:35.863 "seek_hole": false, 00:27:35.863 "seek_data": false, 00:27:35.863 "copy": true, 00:27:35.863 "nvme_iov_md": false 00:27:35.863 }, 00:27:35.863 "memory_domains": [ 00:27:35.863 { 00:27:35.863 "dma_device_id": "system", 00:27:35.863 "dma_device_type": 1 00:27:35.863 }, 00:27:35.863 { 00:27:35.863 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:35.863 "dma_device_type": 2 00:27:35.863 } 00:27:35.863 ], 00:27:35.863 "driver_specific": {} 00:27:35.863 }' 00:27:35.863 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:35.863 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:35.863 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:35.863 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:35.863 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:35.864 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:35.864 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:36.124 13:36:16 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:36.124 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:36.124 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:36.124 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:36.124 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:36.124 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:36.124 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:36.124 13:36:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:36.383 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:36.383 "name": "BaseBdev2", 00:27:36.383 "aliases": [ 00:27:36.383 "7d6c69f5-d68b-45b0-8208-d9001df0a064" 00:27:36.383 ], 00:27:36.383 "product_name": "Malloc disk", 00:27:36.383 "block_size": 4128, 00:27:36.383 "num_blocks": 8192, 00:27:36.384 "uuid": "7d6c69f5-d68b-45b0-8208-d9001df0a064", 00:27:36.384 "md_size": 32, 00:27:36.384 "md_interleave": true, 00:27:36.384 "dif_type": 0, 00:27:36.384 "assigned_rate_limits": { 00:27:36.384 "rw_ios_per_sec": 0, 00:27:36.384 "rw_mbytes_per_sec": 0, 00:27:36.384 "r_mbytes_per_sec": 0, 00:27:36.384 "w_mbytes_per_sec": 0 00:27:36.384 }, 00:27:36.384 "claimed": true, 00:27:36.384 "claim_type": "exclusive_write", 00:27:36.384 "zoned": false, 00:27:36.384 "supported_io_types": { 00:27:36.384 "read": true, 00:27:36.384 "write": true, 00:27:36.384 "unmap": true, 00:27:36.384 
"flush": true, 00:27:36.384 "reset": true, 00:27:36.384 "nvme_admin": false, 00:27:36.384 "nvme_io": false, 00:27:36.384 "nvme_io_md": false, 00:27:36.384 "write_zeroes": true, 00:27:36.384 "zcopy": true, 00:27:36.384 "get_zone_info": false, 00:27:36.384 "zone_management": false, 00:27:36.384 "zone_append": false, 00:27:36.384 "compare": false, 00:27:36.384 "compare_and_write": false, 00:27:36.384 "abort": true, 00:27:36.384 "seek_hole": false, 00:27:36.384 "seek_data": false, 00:27:36.384 "copy": true, 00:27:36.384 "nvme_iov_md": false 00:27:36.384 }, 00:27:36.384 "memory_domains": [ 00:27:36.384 { 00:27:36.384 "dma_device_id": "system", 00:27:36.384 "dma_device_type": 1 00:27:36.384 }, 00:27:36.384 { 00:27:36.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:36.384 "dma_device_type": 2 00:27:36.384 } 00:27:36.384 ], 00:27:36.384 "driver_specific": {} 00:27:36.384 }' 00:27:36.384 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:36.384 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:36.384 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:36.384 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:36.384 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:36.384 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:36.384 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:36.643 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:36.643 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:36.643 13:36:17 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:36.643 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:36.643 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:36.643 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:36.903 [2024-07-25 13:36:17.531395] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:36.903 13:36:17 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:36.903 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:37.163 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:37.163 "name": "Existed_Raid", 00:27:37.163 "uuid": "d7b5ba0b-550a-46db-92e2-1b857a73a874", 00:27:37.163 "strip_size_kb": 0, 00:27:37.163 "state": "online", 00:27:37.163 "raid_level": "raid1", 00:27:37.163 "superblock": true, 00:27:37.163 "num_base_bdevs": 2, 00:27:37.163 "num_base_bdevs_discovered": 1, 00:27:37.163 "num_base_bdevs_operational": 1, 00:27:37.163 "base_bdevs_list": [ 00:27:37.163 { 00:27:37.163 "name": null, 00:27:37.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:37.163 "is_configured": false, 00:27:37.163 "data_offset": 256, 00:27:37.163 "data_size": 7936 00:27:37.163 }, 00:27:37.163 { 00:27:37.163 "name": "BaseBdev2", 00:27:37.163 "uuid": "7d6c69f5-d68b-45b0-8208-d9001df0a064", 00:27:37.163 "is_configured": true, 00:27:37.163 "data_offset": 256, 00:27:37.163 "data_size": 7936 00:27:37.163 } 00:27:37.163 ] 00:27:37.163 }' 00:27:37.163 
13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:37.163 13:36:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:37.733 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:37.733 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:37.733 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.733 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:37.733 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:37.733 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:37.733 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:37.993 [2024-07-25 13:36:18.674280] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:37.993 [2024-07-25 13:36:18.674343] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:37.993 [2024-07-25 13:36:18.680623] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:37.993 [2024-07-25 13:36:18.680647] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:37.993 [2024-07-25 13:36:18.680653] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x22056e0 name Existed_Raid, state offline 00:27:37.993 13:36:18 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:37.993 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:37.993 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.993 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:38.252 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:38.252 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:38.252 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 1053993 00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1053993 ']' 00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1053993 00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1053993 00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 
00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1053993' 00:27:38.253 killing process with pid 1053993 00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@969 -- # kill 1053993 00:27:38.253 [2024-07-25 13:36:18.939677] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:38.253 13:36:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@974 -- # wait 1053993 00:27:38.253 [2024-07-25 13:36:18.940258] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:38.513 13:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:27:38.513 00:27:38.513 real 0m8.871s 00:27:38.513 user 0m16.068s 00:27:38.513 sys 0m1.397s 00:27:38.513 13:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:38.513 13:36:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:38.513 ************************************ 00:27:38.513 END TEST raid_state_function_test_sb_md_interleaved 00:27:38.513 ************************************ 00:27:38.513 13:36:19 bdev_raid -- bdev/bdev_raid.sh@993 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:27:38.513 13:36:19 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 4 -le 1 ']' 00:27:38.513 13:36:19 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:38.513 13:36:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:38.513 ************************************ 00:27:38.513 START TEST raid_superblock_test_md_interleaved 00:27:38.513 ************************************ 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1125 -- # raid_superblock_test raid1 2 00:27:38.513 13:36:19 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@408 -- # local raid_level=raid1 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@409 -- # local num_base_bdevs=2 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # base_bdevs_malloc=() 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # local base_bdevs_malloc 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # base_bdevs_pt=() 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # local base_bdevs_pt 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # base_bdevs_pt_uuid=() 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # local base_bdevs_pt_uuid 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@413 -- # local raid_bdev_name=raid_bdev1 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@414 -- # local strip_size 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # local strip_size_create_arg 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local raid_bdev_uuid 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local raid_bdev 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@419 -- # '[' raid1 '!=' raid1 ']' 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@423 -- # strip_size=0 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@427 -- # raid_pid=1055965 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@428 -- # 
waitforlisten 1055965 /var/tmp/spdk-raid.sock 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1055965 ']' 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@426 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:38.513 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:38.513 13:36:19 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:38.513 [2024-07-25 13:36:19.194132] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:27:38.513 [2024-07-25 13:36:19.194186] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1055965 ] 00:27:38.513 [2024-07-25 13:36:19.284275] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:38.773 [2024-07-25 13:36:19.352735] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:38.773 [2024-07-25 13:36:19.394644] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:38.773 [2024-07-25 13:36:19.394668] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:39.394 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:39.395 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:27:39.395 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i = 1 )) 00:27:39.395 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:27:39.395 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc1 00:27:39.395 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt1 00:27:39.395 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:39.395 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:39.395 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:27:39.395 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 
00:27:39.395 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:27:39.655 malloc1 00:27:39.655 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:39.655 [2024-07-25 13:36:20.389109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:39.655 [2024-07-25 13:36:20.389144] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:39.655 [2024-07-25 13:36:20.389155] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x210c920 00:27:39.655 [2024-07-25 13:36:20.389161] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:39.655 [2024-07-25 13:36:20.390300] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:39.655 [2024-07-25 13:36:20.390321] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:39.655 pt1 00:27:39.655 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:27:39.655 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:27:39.655 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@432 -- # local bdev_malloc=malloc2 00:27:39.656 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@433 -- # local bdev_pt=pt2 00:27:39.656 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:39.656 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@436 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:27:39.656 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@437 -- # base_bdevs_pt+=($bdev_pt) 00:27:39.656 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@438 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:39.656 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:27:39.916 malloc2 00:27:39.916 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:40.177 [2024-07-25 13:36:20.756185] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:40.177 [2024-07-25 13:36:20.756213] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:40.177 [2024-07-25 13:36:20.756223] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f19b0 00:27:40.177 [2024-07-25 13:36:20.756229] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:40.177 [2024-07-25 13:36:20.757330] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:40.177 [2024-07-25 13:36:20.757348] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:40.177 pt2 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i++ )) 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # (( i <= num_base_bdevs )) 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 
'pt1 pt2' -n raid_bdev1 -s 00:27:40.177 [2024-07-25 13:36:20.944665] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:40.177 [2024-07-25 13:36:20.945736] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:40.177 [2024-07-25 13:36:20.945839] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x20f3360 00:27:40.177 [2024-07-25 13:36:20.945847] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:40.177 [2024-07-25 13:36:20.945894] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f6130 00:27:40.177 [2024-07-25 13:36:20.945956] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20f3360 00:27:40.177 [2024-07-25 13:36:20.945961] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20f3360 00:27:40.177 [2024-07-25 13:36:20.946011] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@446 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:40.177 13:36:20 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:40.437 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:40.438 "name": "raid_bdev1", 00:27:40.438 "uuid": "d5a0f01b-c195-4712-a026-3ad8d732b3dd", 00:27:40.438 "strip_size_kb": 0, 00:27:40.438 "state": "online", 00:27:40.438 "raid_level": "raid1", 00:27:40.438 "superblock": true, 00:27:40.438 "num_base_bdevs": 2, 00:27:40.438 "num_base_bdevs_discovered": 2, 00:27:40.438 "num_base_bdevs_operational": 2, 00:27:40.438 "base_bdevs_list": [ 00:27:40.438 { 00:27:40.438 "name": "pt1", 00:27:40.438 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:40.438 "is_configured": true, 00:27:40.438 "data_offset": 256, 00:27:40.438 "data_size": 7936 00:27:40.438 }, 00:27:40.438 { 00:27:40.438 "name": "pt2", 00:27:40.438 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:40.438 "is_configured": true, 00:27:40.438 "data_offset": 256, 00:27:40.438 "data_size": 7936 00:27:40.438 } 00:27:40.438 ] 00:27:40.438 }' 00:27:40.438 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:40.438 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:41.008 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # verify_raid_bdev_properties raid_bdev1 00:27:41.008 13:36:21 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:41.008 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:41.008 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:41.008 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:41.008 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:27:41.008 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:41.008 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:41.269 [2024-07-25 13:36:21.871194] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:41.269 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:41.269 "name": "raid_bdev1", 00:27:41.269 "aliases": [ 00:27:41.269 "d5a0f01b-c195-4712-a026-3ad8d732b3dd" 00:27:41.269 ], 00:27:41.269 "product_name": "Raid Volume", 00:27:41.269 "block_size": 4128, 00:27:41.269 "num_blocks": 7936, 00:27:41.269 "uuid": "d5a0f01b-c195-4712-a026-3ad8d732b3dd", 00:27:41.269 "md_size": 32, 00:27:41.269 "md_interleave": true, 00:27:41.269 "dif_type": 0, 00:27:41.269 "assigned_rate_limits": { 00:27:41.269 "rw_ios_per_sec": 0, 00:27:41.269 "rw_mbytes_per_sec": 0, 00:27:41.269 "r_mbytes_per_sec": 0, 00:27:41.269 "w_mbytes_per_sec": 0 00:27:41.269 }, 00:27:41.269 "claimed": false, 00:27:41.269 "zoned": false, 00:27:41.269 "supported_io_types": { 00:27:41.269 "read": true, 00:27:41.269 "write": true, 00:27:41.269 "unmap": false, 00:27:41.269 "flush": false, 00:27:41.269 "reset": true, 00:27:41.269 "nvme_admin": false, 
00:27:41.269 "nvme_io": false, 00:27:41.269 "nvme_io_md": false, 00:27:41.269 "write_zeroes": true, 00:27:41.269 "zcopy": false, 00:27:41.269 "get_zone_info": false, 00:27:41.269 "zone_management": false, 00:27:41.269 "zone_append": false, 00:27:41.269 "compare": false, 00:27:41.269 "compare_and_write": false, 00:27:41.269 "abort": false, 00:27:41.269 "seek_hole": false, 00:27:41.269 "seek_data": false, 00:27:41.269 "copy": false, 00:27:41.269 "nvme_iov_md": false 00:27:41.269 }, 00:27:41.269 "memory_domains": [ 00:27:41.269 { 00:27:41.269 "dma_device_id": "system", 00:27:41.269 "dma_device_type": 1 00:27:41.269 }, 00:27:41.269 { 00:27:41.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:41.269 "dma_device_type": 2 00:27:41.269 }, 00:27:41.269 { 00:27:41.269 "dma_device_id": "system", 00:27:41.269 "dma_device_type": 1 00:27:41.269 }, 00:27:41.269 { 00:27:41.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:41.269 "dma_device_type": 2 00:27:41.269 } 00:27:41.269 ], 00:27:41.269 "driver_specific": { 00:27:41.269 "raid": { 00:27:41.269 "uuid": "d5a0f01b-c195-4712-a026-3ad8d732b3dd", 00:27:41.269 "strip_size_kb": 0, 00:27:41.269 "state": "online", 00:27:41.269 "raid_level": "raid1", 00:27:41.269 "superblock": true, 00:27:41.269 "num_base_bdevs": 2, 00:27:41.269 "num_base_bdevs_discovered": 2, 00:27:41.269 "num_base_bdevs_operational": 2, 00:27:41.269 "base_bdevs_list": [ 00:27:41.269 { 00:27:41.269 "name": "pt1", 00:27:41.269 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:41.269 "is_configured": true, 00:27:41.269 "data_offset": 256, 00:27:41.269 "data_size": 7936 00:27:41.269 }, 00:27:41.269 { 00:27:41.269 "name": "pt2", 00:27:41.269 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:41.269 "is_configured": true, 00:27:41.269 "data_offset": 256, 00:27:41.269 "data_size": 7936 00:27:41.269 } 00:27:41.269 ] 00:27:41.269 } 00:27:41.269 } 00:27:41.269 }' 00:27:41.269 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:41.269 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:41.269 pt2' 00:27:41.269 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:41.269 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:41.269 13:36:21 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:41.530 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:41.530 "name": "pt1", 00:27:41.530 "aliases": [ 00:27:41.530 "00000000-0000-0000-0000-000000000001" 00:27:41.530 ], 00:27:41.530 "product_name": "passthru", 00:27:41.530 "block_size": 4128, 00:27:41.530 "num_blocks": 8192, 00:27:41.530 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:41.530 "md_size": 32, 00:27:41.530 "md_interleave": true, 00:27:41.530 "dif_type": 0, 00:27:41.530 "assigned_rate_limits": { 00:27:41.530 "rw_ios_per_sec": 0, 00:27:41.530 "rw_mbytes_per_sec": 0, 00:27:41.530 "r_mbytes_per_sec": 0, 00:27:41.530 "w_mbytes_per_sec": 0 00:27:41.530 }, 00:27:41.530 "claimed": true, 00:27:41.530 "claim_type": "exclusive_write", 00:27:41.530 "zoned": false, 00:27:41.530 "supported_io_types": { 00:27:41.530 "read": true, 00:27:41.530 "write": true, 00:27:41.530 "unmap": true, 00:27:41.530 "flush": true, 00:27:41.530 "reset": true, 00:27:41.530 "nvme_admin": false, 00:27:41.530 "nvme_io": false, 00:27:41.530 "nvme_io_md": false, 00:27:41.530 "write_zeroes": true, 00:27:41.530 "zcopy": true, 00:27:41.530 "get_zone_info": false, 00:27:41.530 "zone_management": false, 00:27:41.530 "zone_append": false, 00:27:41.530 "compare": false, 00:27:41.530 "compare_and_write": false, 00:27:41.530 
"abort": true, 00:27:41.530 "seek_hole": false, 00:27:41.530 "seek_data": false, 00:27:41.530 "copy": true, 00:27:41.530 "nvme_iov_md": false 00:27:41.530 }, 00:27:41.530 "memory_domains": [ 00:27:41.530 { 00:27:41.530 "dma_device_id": "system", 00:27:41.530 "dma_device_type": 1 00:27:41.530 }, 00:27:41.530 { 00:27:41.530 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:41.530 "dma_device_type": 2 00:27:41.530 } 00:27:41.530 ], 00:27:41.530 "driver_specific": { 00:27:41.530 "passthru": { 00:27:41.530 "name": "pt1", 00:27:41.530 "base_bdev_name": "malloc1" 00:27:41.530 } 00:27:41.530 } 00:27:41.530 }' 00:27:41.530 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:41.530 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:41.530 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:41.530 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:41.530 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:41.530 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:41.530 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:41.789 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:41.789 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:41.789 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:41.789 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:41.789 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:41.789 13:36:22 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:41.789 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:41.789 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:42.049 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:42.049 "name": "pt2", 00:27:42.049 "aliases": [ 00:27:42.049 "00000000-0000-0000-0000-000000000002" 00:27:42.049 ], 00:27:42.049 "product_name": "passthru", 00:27:42.049 "block_size": 4128, 00:27:42.049 "num_blocks": 8192, 00:27:42.049 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:42.049 "md_size": 32, 00:27:42.049 "md_interleave": true, 00:27:42.049 "dif_type": 0, 00:27:42.049 "assigned_rate_limits": { 00:27:42.049 "rw_ios_per_sec": 0, 00:27:42.049 "rw_mbytes_per_sec": 0, 00:27:42.049 "r_mbytes_per_sec": 0, 00:27:42.049 "w_mbytes_per_sec": 0 00:27:42.049 }, 00:27:42.049 "claimed": true, 00:27:42.049 "claim_type": "exclusive_write", 00:27:42.049 "zoned": false, 00:27:42.049 "supported_io_types": { 00:27:42.049 "read": true, 00:27:42.049 "write": true, 00:27:42.049 "unmap": true, 00:27:42.049 "flush": true, 00:27:42.049 "reset": true, 00:27:42.049 "nvme_admin": false, 00:27:42.049 "nvme_io": false, 00:27:42.049 "nvme_io_md": false, 00:27:42.049 "write_zeroes": true, 00:27:42.049 "zcopy": true, 00:27:42.049 "get_zone_info": false, 00:27:42.049 "zone_management": false, 00:27:42.049 "zone_append": false, 00:27:42.049 "compare": false, 00:27:42.049 "compare_and_write": false, 00:27:42.049 "abort": true, 00:27:42.049 "seek_hole": false, 00:27:42.049 "seek_data": false, 00:27:42.049 "copy": true, 00:27:42.049 "nvme_iov_md": false 00:27:42.049 }, 00:27:42.049 "memory_domains": [ 00:27:42.049 { 00:27:42.049 "dma_device_id": 
"system", 00:27:42.049 "dma_device_type": 1 00:27:42.049 }, 00:27:42.049 { 00:27:42.049 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:42.049 "dma_device_type": 2 00:27:42.049 } 00:27:42.049 ], 00:27:42.049 "driver_specific": { 00:27:42.049 "passthru": { 00:27:42.049 "name": "pt2", 00:27:42.049 "base_bdev_name": "malloc2" 00:27:42.049 } 00:27:42.049 } 00:27:42.049 }' 00:27:42.049 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:42.049 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:42.049 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:42.049 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:42.049 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:42.309 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:42.309 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:42.309 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:42.309 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:42.309 13:36:22 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:42.309 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:42.309 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:42.309 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:42.309 13:36:23 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # jq -r '.[] | .uuid' 00:27:42.569 [2024-07-25 13:36:23.210565] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:42.569 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # raid_bdev_uuid=d5a0f01b-c195-4712-a026-3ad8d732b3dd 00:27:42.569 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@451 -- # '[' -z d5a0f01b-c195-4712-a026-3ad8d732b3dd ']' 00:27:42.569 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:42.829 [2024-07-25 13:36:23.386799] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:42.829 [2024-07-25 13:36:23.386810] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:42.829 [2024-07-25 13:36:23.386845] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:42.829 [2024-07-25 13:36:23.386884] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:42.829 [2024-07-25 13:36:23.386890] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20f3360 name raid_bdev1, state offline 00:27:42.829 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.829 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # jq -r '.[]' 00:27:42.829 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@457 -- # raid_bdev= 00:27:42.829 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # '[' -n '' ']' 00:27:42.829 13:36:23 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:27:42.829 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:43.089 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@463 -- # for i in "${base_bdevs_pt[@]}" 00:27:43.089 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:43.348 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:43.348 13:36:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@466 -- # '[' false == true ']' 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@472 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@650 -- # local es=0 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:43.609 13:36:24 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:43.609 [2024-07-25 13:36:24.353209] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:43.609 [2024-07-25 13:36:24.354271] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:43.609 [2024-07-25 13:36:24.354313] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:43.609 [2024-07-25 13:36:24.354342] bdev_raid.c:3219:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:43.609 [2024-07-25 13:36:24.354358] 
bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:43.609 [2024-07-25 13:36:24.354363] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20f5a80 name raid_bdev1, state configuring 00:27:43.609 request: 00:27:43.609 { 00:27:43.609 "name": "raid_bdev1", 00:27:43.609 "raid_level": "raid1", 00:27:43.609 "base_bdevs": [ 00:27:43.609 "malloc1", 00:27:43.609 "malloc2" 00:27:43.609 ], 00:27:43.609 "superblock": false, 00:27:43.609 "method": "bdev_raid_create", 00:27:43.609 "req_id": 1 00:27:43.609 } 00:27:43.609 Got JSON-RPC error response 00:27:43.609 response: 00:27:43.609 { 00:27:43.609 "code": -17, 00:27:43.609 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:43.609 } 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@653 -- # es=1 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@661 -- # (( es > 128 )) 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@672 -- # [[ -n '' ]] 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@677 -- # (( !es == 0 )) 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:43.609 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # jq -r '.[]' 00:27:43.869 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@474 -- # raid_bdev= 00:27:43.869 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@475 -- # '[' -n '' ']' 00:27:43.869 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@480 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:27:44.129 [2024-07-25 13:36:24.738143] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:44.129 [2024-07-25 13:36:24.738165] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:44.129 [2024-07-25 13:36:24.738174] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f62a0 00:27:44.129 [2024-07-25 13:36:24.738180] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:44.129 [2024-07-25 13:36:24.739301] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:44.129 [2024-07-25 13:36:24.739318] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:44.129 [2024-07-25 13:36:24.739346] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:44.129 [2024-07-25 13:36:24.739364] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:44.129 pt1 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.129 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.389 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:44.389 "name": "raid_bdev1", 00:27:44.389 "uuid": "d5a0f01b-c195-4712-a026-3ad8d732b3dd", 00:27:44.389 "strip_size_kb": 0, 00:27:44.389 "state": "configuring", 00:27:44.389 "raid_level": "raid1", 00:27:44.389 "superblock": true, 00:27:44.389 "num_base_bdevs": 2, 00:27:44.389 "num_base_bdevs_discovered": 1, 00:27:44.389 "num_base_bdevs_operational": 2, 00:27:44.389 "base_bdevs_list": [ 00:27:44.389 { 00:27:44.389 "name": "pt1", 00:27:44.389 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:44.389 "is_configured": true, 00:27:44.389 "data_offset": 256, 00:27:44.389 "data_size": 7936 00:27:44.389 }, 00:27:44.389 { 00:27:44.389 "name": null, 00:27:44.389 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:44.389 "is_configured": false, 00:27:44.389 "data_offset": 256, 00:27:44.389 "data_size": 7936 00:27:44.389 } 00:27:44.389 ] 00:27:44.389 }' 00:27:44.389 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:44.389 13:36:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@485 -- # '[' 2 -gt 2 ']' 00:27:44.960 13:36:25 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i = 1 )) 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@494 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:44.960 [2024-07-25 13:36:25.688560] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:44.960 [2024-07-25 13:36:25.688591] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:44.960 [2024-07-25 13:36:25.688600] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f3c70 00:27:44.960 [2024-07-25 13:36:25.688606] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:44.960 [2024-07-25 13:36:25.688722] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:44.960 [2024-07-25 13:36:25.688731] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:44.960 [2024-07-25 13:36:25.688757] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:44.960 [2024-07-25 13:36:25.688768] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:44.960 [2024-07-25 13:36:25.688832] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x20f4c00 00:27:44.960 [2024-07-25 13:36:25.688838] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:44.960 [2024-07-25 13:36:25.688876] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f6fa20 00:27:44.960 [2024-07-25 13:36:25.688934] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20f4c00 00:27:44.960 [2024-07-25 13:36:25.688940] 
bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20f4c00 00:27:44.960 [2024-07-25 13:36:25.688980] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:44.960 pt2 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i++ )) 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@493 -- # (( i < num_base_bdevs )) 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.960 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:45.221 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:45.221 "name": "raid_bdev1", 00:27:45.221 "uuid": "d5a0f01b-c195-4712-a026-3ad8d732b3dd", 00:27:45.221 "strip_size_kb": 0, 00:27:45.221 "state": "online", 00:27:45.221 "raid_level": "raid1", 00:27:45.221 "superblock": true, 00:27:45.221 "num_base_bdevs": 2, 00:27:45.221 "num_base_bdevs_discovered": 2, 00:27:45.221 "num_base_bdevs_operational": 2, 00:27:45.221 "base_bdevs_list": [ 00:27:45.221 { 00:27:45.221 "name": "pt1", 00:27:45.221 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:45.221 "is_configured": true, 00:27:45.221 "data_offset": 256, 00:27:45.221 "data_size": 7936 00:27:45.221 }, 00:27:45.221 { 00:27:45.221 "name": "pt2", 00:27:45.221 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:45.221 "is_configured": true, 00:27:45.221 "data_offset": 256, 00:27:45.221 "data_size": 7936 00:27:45.221 } 00:27:45.221 ] 00:27:45.221 }' 00:27:45.221 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:45.221 13:36:25 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:45.791 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # verify_raid_bdev_properties raid_bdev1 00:27:45.791 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:45.791 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:45.791 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:45.791 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:45.791 13:36:26 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:27:45.791 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:45.791 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:46.050 [2024-07-25 13:36:26.631152] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:46.051 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:46.051 "name": "raid_bdev1", 00:27:46.051 "aliases": [ 00:27:46.051 "d5a0f01b-c195-4712-a026-3ad8d732b3dd" 00:27:46.051 ], 00:27:46.051 "product_name": "Raid Volume", 00:27:46.051 "block_size": 4128, 00:27:46.051 "num_blocks": 7936, 00:27:46.051 "uuid": "d5a0f01b-c195-4712-a026-3ad8d732b3dd", 00:27:46.051 "md_size": 32, 00:27:46.051 "md_interleave": true, 00:27:46.051 "dif_type": 0, 00:27:46.051 "assigned_rate_limits": { 00:27:46.051 "rw_ios_per_sec": 0, 00:27:46.051 "rw_mbytes_per_sec": 0, 00:27:46.051 "r_mbytes_per_sec": 0, 00:27:46.051 "w_mbytes_per_sec": 0 00:27:46.051 }, 00:27:46.051 "claimed": false, 00:27:46.051 "zoned": false, 00:27:46.051 "supported_io_types": { 00:27:46.051 "read": true, 00:27:46.051 "write": true, 00:27:46.051 "unmap": false, 00:27:46.051 "flush": false, 00:27:46.051 "reset": true, 00:27:46.051 "nvme_admin": false, 00:27:46.051 "nvme_io": false, 00:27:46.051 "nvme_io_md": false, 00:27:46.051 "write_zeroes": true, 00:27:46.051 "zcopy": false, 00:27:46.051 "get_zone_info": false, 00:27:46.051 "zone_management": false, 00:27:46.051 "zone_append": false, 00:27:46.051 "compare": false, 00:27:46.051 "compare_and_write": false, 00:27:46.051 "abort": false, 00:27:46.051 "seek_hole": false, 00:27:46.051 "seek_data": false, 00:27:46.051 "copy": false, 00:27:46.051 "nvme_iov_md": false 00:27:46.051 }, 
00:27:46.051 "memory_domains": [ 00:27:46.051 { 00:27:46.051 "dma_device_id": "system", 00:27:46.051 "dma_device_type": 1 00:27:46.051 }, 00:27:46.051 { 00:27:46.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:46.051 "dma_device_type": 2 00:27:46.051 }, 00:27:46.051 { 00:27:46.051 "dma_device_id": "system", 00:27:46.051 "dma_device_type": 1 00:27:46.051 }, 00:27:46.051 { 00:27:46.051 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:46.051 "dma_device_type": 2 00:27:46.051 } 00:27:46.051 ], 00:27:46.051 "driver_specific": { 00:27:46.051 "raid": { 00:27:46.051 "uuid": "d5a0f01b-c195-4712-a026-3ad8d732b3dd", 00:27:46.051 "strip_size_kb": 0, 00:27:46.051 "state": "online", 00:27:46.051 "raid_level": "raid1", 00:27:46.051 "superblock": true, 00:27:46.051 "num_base_bdevs": 2, 00:27:46.051 "num_base_bdevs_discovered": 2, 00:27:46.051 "num_base_bdevs_operational": 2, 00:27:46.051 "base_bdevs_list": [ 00:27:46.051 { 00:27:46.051 "name": "pt1", 00:27:46.051 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:46.051 "is_configured": true, 00:27:46.051 "data_offset": 256, 00:27:46.051 "data_size": 7936 00:27:46.051 }, 00:27:46.051 { 00:27:46.051 "name": "pt2", 00:27:46.051 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:46.051 "is_configured": true, 00:27:46.051 "data_offset": 256, 00:27:46.051 "data_size": 7936 00:27:46.051 } 00:27:46.051 ] 00:27:46.051 } 00:27:46.051 } 00:27:46.051 }' 00:27:46.051 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:46.051 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:46.051 pt2' 00:27:46.051 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:46.051 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:46.051 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:46.311 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:46.311 "name": "pt1", 00:27:46.311 "aliases": [ 00:27:46.311 "00000000-0000-0000-0000-000000000001" 00:27:46.311 ], 00:27:46.311 "product_name": "passthru", 00:27:46.311 "block_size": 4128, 00:27:46.311 "num_blocks": 8192, 00:27:46.311 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:46.311 "md_size": 32, 00:27:46.311 "md_interleave": true, 00:27:46.311 "dif_type": 0, 00:27:46.311 "assigned_rate_limits": { 00:27:46.311 "rw_ios_per_sec": 0, 00:27:46.311 "rw_mbytes_per_sec": 0, 00:27:46.311 "r_mbytes_per_sec": 0, 00:27:46.311 "w_mbytes_per_sec": 0 00:27:46.311 }, 00:27:46.311 "claimed": true, 00:27:46.311 "claim_type": "exclusive_write", 00:27:46.311 "zoned": false, 00:27:46.311 "supported_io_types": { 00:27:46.311 "read": true, 00:27:46.311 "write": true, 00:27:46.311 "unmap": true, 00:27:46.311 "flush": true, 00:27:46.311 "reset": true, 00:27:46.311 "nvme_admin": false, 00:27:46.311 "nvme_io": false, 00:27:46.311 "nvme_io_md": false, 00:27:46.311 "write_zeroes": true, 00:27:46.311 "zcopy": true, 00:27:46.311 "get_zone_info": false, 00:27:46.311 "zone_management": false, 00:27:46.311 "zone_append": false, 00:27:46.311 "compare": false, 00:27:46.311 "compare_and_write": false, 00:27:46.311 "abort": true, 00:27:46.311 "seek_hole": false, 00:27:46.311 "seek_data": false, 00:27:46.311 "copy": true, 00:27:46.311 "nvme_iov_md": false 00:27:46.311 }, 00:27:46.311 "memory_domains": [ 00:27:46.311 { 00:27:46.311 "dma_device_id": "system", 00:27:46.311 "dma_device_type": 1 00:27:46.311 }, 00:27:46.311 { 00:27:46.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:46.311 "dma_device_type": 2 00:27:46.311 } 00:27:46.311 ], 00:27:46.311 
"driver_specific": { 00:27:46.311 "passthru": { 00:27:46.311 "name": "pt1", 00:27:46.311 "base_bdev_name": "malloc1" 00:27:46.311 } 00:27:46.311 } 00:27:46.311 }' 00:27:46.311 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:46.311 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:46.311 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:46.311 13:36:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:46.311 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:46.311 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:46.311 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:46.572 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:46.572 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:46.572 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:46.572 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:46.572 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:46.572 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:46.572 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:46.572 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:46.833 13:36:27 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:46.833 "name": "pt2", 00:27:46.833 "aliases": [ 00:27:46.833 "00000000-0000-0000-0000-000000000002" 00:27:46.833 ], 00:27:46.833 "product_name": "passthru", 00:27:46.833 "block_size": 4128, 00:27:46.833 "num_blocks": 8192, 00:27:46.833 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:46.833 "md_size": 32, 00:27:46.833 "md_interleave": true, 00:27:46.833 "dif_type": 0, 00:27:46.833 "assigned_rate_limits": { 00:27:46.833 "rw_ios_per_sec": 0, 00:27:46.833 "rw_mbytes_per_sec": 0, 00:27:46.833 "r_mbytes_per_sec": 0, 00:27:46.833 "w_mbytes_per_sec": 0 00:27:46.833 }, 00:27:46.833 "claimed": true, 00:27:46.833 "claim_type": "exclusive_write", 00:27:46.833 "zoned": false, 00:27:46.833 "supported_io_types": { 00:27:46.833 "read": true, 00:27:46.833 "write": true, 00:27:46.833 "unmap": true, 00:27:46.833 "flush": true, 00:27:46.833 "reset": true, 00:27:46.833 "nvme_admin": false, 00:27:46.833 "nvme_io": false, 00:27:46.833 "nvme_io_md": false, 00:27:46.833 "write_zeroes": true, 00:27:46.833 "zcopy": true, 00:27:46.833 "get_zone_info": false, 00:27:46.833 "zone_management": false, 00:27:46.833 "zone_append": false, 00:27:46.833 "compare": false, 00:27:46.833 "compare_and_write": false, 00:27:46.833 "abort": true, 00:27:46.833 "seek_hole": false, 00:27:46.833 "seek_data": false, 00:27:46.833 "copy": true, 00:27:46.833 "nvme_iov_md": false 00:27:46.833 }, 00:27:46.833 "memory_domains": [ 00:27:46.833 { 00:27:46.833 "dma_device_id": "system", 00:27:46.833 "dma_device_type": 1 00:27:46.833 }, 00:27:46.833 { 00:27:46.833 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:46.833 "dma_device_type": 2 00:27:46.833 } 00:27:46.833 ], 00:27:46.833 "driver_specific": { 00:27:46.833 "passthru": { 00:27:46.833 "name": "pt2", 00:27:46.833 "base_bdev_name": "malloc2" 00:27:46.833 } 00:27:46.833 } 00:27:46.833 }' 00:27:46.833 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:46.833 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:46.833 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:46.833 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:46.833 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:46.833 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:46.833 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:47.093 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:47.093 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:47.093 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:47.093 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:47.093 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:47.093 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:47.093 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # jq -r '.[] | .uuid' 00:27:47.354 [2024-07-25 13:36:27.934447] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:47.354 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@502 -- # '[' d5a0f01b-c195-4712-a026-3ad8d732b3dd '!=' d5a0f01b-c195-4712-a026-3ad8d732b3dd ']' 00:27:47.354 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@506 -- # has_redundancy raid1 00:27:47.354 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:47.354 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:27:47.354 13:36:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@508 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:47.354 [2024-07-25 13:36:28.126749] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:47.354 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@511 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:47.354 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:47.354 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:47.354 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:47.354 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:47.354 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:47.354 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:47.354 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:47.354 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:47.354 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:47.614 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.614 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:47.614 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:47.614 "name": "raid_bdev1", 00:27:47.614 "uuid": "d5a0f01b-c195-4712-a026-3ad8d732b3dd", 00:27:47.614 "strip_size_kb": 0, 00:27:47.614 "state": "online", 00:27:47.614 "raid_level": "raid1", 00:27:47.614 "superblock": true, 00:27:47.614 "num_base_bdevs": 2, 00:27:47.614 "num_base_bdevs_discovered": 1, 00:27:47.614 "num_base_bdevs_operational": 1, 00:27:47.614 "base_bdevs_list": [ 00:27:47.614 { 00:27:47.614 "name": null, 00:27:47.614 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:47.614 "is_configured": false, 00:27:47.614 "data_offset": 256, 00:27:47.614 "data_size": 7936 00:27:47.614 }, 00:27:47.614 { 00:27:47.614 "name": "pt2", 00:27:47.614 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:47.614 "is_configured": true, 00:27:47.614 "data_offset": 256, 00:27:47.614 "data_size": 7936 00:27:47.614 } 00:27:47.614 ] 00:27:47.614 }' 00:27:47.614 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:47.614 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:48.184 13:36:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:48.444 [2024-07-25 13:36:29.029011] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:48.444 [2024-07-25 13:36:29.029027] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:48.444 [2024-07-25 13:36:29.029060] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:48.444 [2024-07-25 
13:36:29.029088] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:48.444 [2024-07-25 13:36:29.029094] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20f4c00 name raid_bdev1, state offline 00:27:48.444 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.444 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # jq -r '.[]' 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@515 -- # raid_bdev= 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@516 -- # '[' -n '' ']' 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i = 1 )) 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i++ )) 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@521 -- # (( i < num_base_bdevs )) 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # (( i = 1 )) 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # (( i < num_base_bdevs - 1 )) 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@534 -- # i=1 00:27:48.706 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@535 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:48.967 [2024-07-25 13:36:29.606450] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:48.967 [2024-07-25 13:36:29.606481] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:48.967 [2024-07-25 13:36:29.606492] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f5a80 00:27:48.967 [2024-07-25 13:36:29.606498] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:48.967 [2024-07-25 13:36:29.607624] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:48.967 [2024-07-25 13:36:29.607644] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:48.967 [2024-07-25 13:36:29.607678] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:48.967 [2024-07-25 13:36:29.607696] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:48.967 [2024-07-25 13:36:29.607746] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f6f310 00:27:48.967 [2024-07-25 13:36:29.607752] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:48.967 [2024-07-25 13:36:29.607792] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f26e0 00:27:48.967 [2024-07-25 13:36:29.607848] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f6f310 00:27:48.967 [2024-07-25 13:36:29.607853] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1f6f310 00:27:48.967 [2024-07-25 13:36:29.607893] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:48.967 pt2 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@538 
-- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.967 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:49.227 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:49.227 "name": "raid_bdev1", 00:27:49.227 "uuid": "d5a0f01b-c195-4712-a026-3ad8d732b3dd", 00:27:49.227 "strip_size_kb": 0, 00:27:49.227 "state": "online", 00:27:49.227 "raid_level": "raid1", 00:27:49.227 "superblock": true, 00:27:49.227 "num_base_bdevs": 2, 00:27:49.227 "num_base_bdevs_discovered": 1, 00:27:49.227 "num_base_bdevs_operational": 1, 00:27:49.227 
"base_bdevs_list": [ 00:27:49.227 { 00:27:49.227 "name": null, 00:27:49.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:49.227 "is_configured": false, 00:27:49.227 "data_offset": 256, 00:27:49.227 "data_size": 7936 00:27:49.227 }, 00:27:49.227 { 00:27:49.227 "name": "pt2", 00:27:49.227 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:49.227 "is_configured": true, 00:27:49.227 "data_offset": 256, 00:27:49.227 "data_size": 7936 00:27:49.227 } 00:27:49.227 ] 00:27:49.227 }' 00:27:49.228 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:49.228 13:36:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:49.798 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:49.798 [2024-07-25 13:36:30.536805] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:49.798 [2024-07-25 13:36:30.536824] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:49.798 [2024-07-25 13:36:30.536857] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:49.798 [2024-07-25 13:36:30.536886] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:49.798 [2024-07-25 13:36:30.536892] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f6f310 name raid_bdev1, state offline 00:27:49.798 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # jq -r '.[]' 00:27:49.798 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@542 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.059 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@542 -- # raid_bdev= 00:27:50.059 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@543 -- # '[' -n '' ']' 00:27:50.059 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@547 -- # '[' 2 -gt 2 ']' 00:27:50.059 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@555 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:50.319 [2024-07-25 13:36:30.925777] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:50.319 [2024-07-25 13:36:30.925802] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:50.319 [2024-07-25 13:36:30.925811] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x20f3ea0 00:27:50.319 [2024-07-25 13:36:30.925817] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:50.319 [2024-07-25 13:36:30.926927] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:50.319 [2024-07-25 13:36:30.926946] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:50.319 [2024-07-25 13:36:30.926976] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:50.319 [2024-07-25 13:36:30.926992] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:50.319 [2024-07-25 13:36:30.927053] bdev_raid.c:3665:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:50.319 [2024-07-25 13:36:30.927060] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:50.319 [2024-07-25 13:36:30.927067] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20f4d80 name raid_bdev1, state configuring 00:27:50.319 [2024-07-25 13:36:30.927080] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:50.319 [2024-07-25 13:36:30.927115] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x20f22f0 00:27:50.319 [2024-07-25 13:36:30.927120] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:50.319 [2024-07-25 13:36:30.927159] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20f4fd0 00:27:50.319 [2024-07-25 13:36:30.927213] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x20f22f0 00:27:50.319 [2024-07-25 13:36:30.927218] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x20f22f0 00:27:50.319 [2024-07-25 13:36:30.927267] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:50.319 pt1 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 2 -gt 2 ']' 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@569 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:50.319 13:36:30 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.319 13:36:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:50.579 13:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.579 "name": "raid_bdev1", 00:27:50.579 "uuid": "d5a0f01b-c195-4712-a026-3ad8d732b3dd", 00:27:50.579 "strip_size_kb": 0, 00:27:50.579 "state": "online", 00:27:50.579 "raid_level": "raid1", 00:27:50.579 "superblock": true, 00:27:50.579 "num_base_bdevs": 2, 00:27:50.579 "num_base_bdevs_discovered": 1, 00:27:50.579 "num_base_bdevs_operational": 1, 00:27:50.579 "base_bdevs_list": [ 00:27:50.579 { 00:27:50.579 "name": null, 00:27:50.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:50.579 "is_configured": false, 00:27:50.579 "data_offset": 256, 00:27:50.579 "data_size": 7936 00:27:50.579 }, 00:27:50.579 { 00:27:50.579 "name": "pt2", 00:27:50.579 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:50.579 "is_configured": true, 00:27:50.579 "data_offset": 256, 00:27:50.579 "data_size": 7936 00:27:50.579 } 00:27:50.579 ] 00:27:50.579 }' 00:27:50.579 13:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.579 13:36:31 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:51.149 13:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:51.149 13:36:31 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@570 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:51.149 13:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@570 -- # [[ false == \f\a\l\s\e ]] 00:27:51.149 13:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:51.149 13:36:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # jq -r '.[] | .uuid' 00:27:51.410 [2024-07-25 13:36:32.064847] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@573 -- # '[' d5a0f01b-c195-4712-a026-3ad8d732b3dd '!=' d5a0f01b-c195-4712-a026-3ad8d732b3dd ']' 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@578 -- # killprocess 1055965 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1055965 ']' 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1055965 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # uname 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1055965 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@968 -- # echo 'killing process with pid 1055965' 00:27:51.410 killing process with pid 1055965 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@969 -- # kill 1055965 00:27:51.410 [2024-07-25 13:36:32.130872] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:51.410 [2024-07-25 13:36:32.130909] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:51.410 [2024-07-25 13:36:32.130939] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:51.410 [2024-07-25 13:36:32.130945] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x20f22f0 name raid_bdev1, state offline 00:27:51.410 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@974 -- # wait 1055965 00:27:51.410 [2024-07-25 13:36:32.140466] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:51.670 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@580 -- # return 0 00:27:51.670 00:27:51.670 real 0m13.120s 00:27:51.670 user 0m24.255s 00:27:51.670 sys 0m2.015s 00:27:51.670 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable 00:27:51.670 13:36:32 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:51.670 ************************************ 00:27:51.670 END TEST raid_superblock_test_md_interleaved 00:27:51.670 ************************************ 00:27:51.670 13:36:32 bdev_raid -- bdev/bdev_raid.sh@994 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:27:51.670 13:36:32 bdev_raid -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:27:51.670 13:36:32 bdev_raid -- common/autotest_common.sh@1107 -- # xtrace_disable 00:27:51.670 13:36:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:51.670 
************************************ 00:27:51.670 START TEST raid_rebuild_test_sb_md_interleaved 00:27:51.670 ************************************ 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # raid_rebuild_test raid1 2 true false false 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@584 -- # local raid_level=raid1 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@585 -- # local num_base_bdevs=2 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@586 -- # local superblock=true 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@587 -- # local background_io=false 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # local verify=false 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i = 1 )) 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # echo BaseBdev1 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # echo BaseBdev2 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i++ )) 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # (( i <= num_base_bdevs )) 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:51.670 13:36:32 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@589 -- # local base_bdevs 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@590 -- # local raid_bdev_name=raid_bdev1 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # local strip_size 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # local create_arg 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@593 -- # local raid_bdev_size 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@594 -- # local data_offset 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # '[' raid1 '!=' raid1 ']' 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@604 -- # strip_size=0 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # '[' true = true ']' 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # create_arg+=' -s' 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # raid_pid=1058424 00:27:51.670 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@613 -- # waitforlisten 1058424 /var/tmp/spdk-raid.sock 00:27:51.671 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@831 -- # '[' -z 1058424 ']' 00:27:51.671 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:51.671 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:51.671 13:36:32 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@836 -- # local max_retries=100 00:27:51.671 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:51.671 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:51.671 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@840 -- # xtrace_disable 00:27:51.671 13:36:32 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:51.671 [2024-07-25 13:36:32.403786] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:27:51.671 [2024-07-25 13:36:32.403830] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1058424 ] 00:27:51.671 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:51.671 Zero copy mechanism will not be used. 
00:27:51.931 [2024-07-25 13:36:32.488617] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:51.931 [2024-07-25 13:36:32.551849] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:27:51.931 [2024-07-25 13:36:32.598076] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:51.931 [2024-07-25 13:36:32.598100] bdev_raid.c:1443:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:52.501 13:36:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:27:52.501 13:36:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@864 -- # return 0 00:27:52.501 13:36:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:52.501 13:36:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:27:52.762 BaseBdev1_malloc 00:27:52.762 13:36:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:53.022 [2024-07-25 13:36:33.620598] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:53.022 [2024-07-25 13:36:33.620635] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:53.022 [2024-07-25 13:36:33.620650] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b56630 00:27:53.022 [2024-07-25 13:36:33.620657] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:53.022 [2024-07-25 13:36:33.621815] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:53.022 [2024-07-25 13:36:33.621835] vbdev_passthru.c: 
710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:53.022 BaseBdev1 00:27:53.022 13:36:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@616 -- # for bdev in "${base_bdevs[@]}" 00:27:53.022 13:36:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@617 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:27:53.282 BaseBdev2_malloc 00:27:53.282 13:36:33 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:53.282 [2024-07-25 13:36:34.003680] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:53.282 [2024-07-25 13:36:34.003710] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:53.282 [2024-07-25 13:36:34.003722] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce3d90 00:27:53.282 [2024-07-25 13:36:34.003729] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:53.282 [2024-07-25 13:36:34.004843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:53.282 [2024-07-25 13:36:34.004862] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:53.282 BaseBdev2 00:27:53.282 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:27:53.543 spare_malloc 00:27:53.543 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:27:53.803 spare_delay 00:27:53.803 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@624 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:53.803 [2024-07-25 13:36:34.559157] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:53.803 [2024-07-25 13:36:34.559185] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:53.803 [2024-07-25 13:36:34.559199] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ce49b0 00:27:53.803 [2024-07-25 13:36:34.559206] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:53.803 [2024-07-25 13:36:34.560308] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:53.803 [2024-07-25 13:36:34.560327] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:53.803 spare 00:27:53.803 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@627 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:54.062 [2024-07-25 13:36:34.747651] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:54.062 [2024-07-25 13:36:34.748640] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:54.062 [2024-07-25 13:36:34.748746] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ce71c0 00:27:54.062 [2024-07-25 13:36:34.748753] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:54.062 [2024-07-25 13:36:34.748801] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b4f680 00:27:54.062 [2024-07-25 13:36:34.748864] bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x1ce71c0 00:27:54.062 [2024-07-25 13:36:34.748873] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ce71c0 00:27:54.062 [2024-07-25 13:36:34.748923] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@628 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.062 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.322 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:27:54.322 "name": "raid_bdev1", 00:27:54.322 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:27:54.322 "strip_size_kb": 0, 00:27:54.322 "state": "online", 00:27:54.322 "raid_level": "raid1", 00:27:54.322 "superblock": true, 00:27:54.322 "num_base_bdevs": 2, 00:27:54.322 "num_base_bdevs_discovered": 2, 00:27:54.322 "num_base_bdevs_operational": 2, 00:27:54.322 "base_bdevs_list": [ 00:27:54.322 { 00:27:54.322 "name": "BaseBdev1", 00:27:54.322 "uuid": "19f878df-a7a0-529c-b11f-4c4b6f09e43c", 00:27:54.322 "is_configured": true, 00:27:54.322 "data_offset": 256, 00:27:54.322 "data_size": 7936 00:27:54.322 }, 00:27:54.322 { 00:27:54.322 "name": "BaseBdev2", 00:27:54.322 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:27:54.322 "is_configured": true, 00:27:54.322 "data_offset": 256, 00:27:54.322 "data_size": 7936 00:27:54.322 } 00:27:54.322 ] 00:27:54.322 }' 00:27:54.322 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.322 13:36:34 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:54.891 13:36:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:54.891 13:36:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # jq -r '.[].num_blocks' 00:27:54.891 [2024-07-25 13:36:35.638082] bdev_raid.c:1120:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:54.891 13:36:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@631 -- # raid_bdev_size=7936 00:27:54.891 13:36:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.891 13:36:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:27:55.150 13:36:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@634 -- # data_offset=256 00:27:55.150 13:36:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@636 -- # '[' false = true ']' 00:27:55.150 13:36:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # '[' false = true ']' 00:27:55.150 13:36:35 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:55.411 [2024-07-25 13:36:36.022829] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:55.411 
13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:55.411 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:55.670 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:55.670 "name": "raid_bdev1", 00:27:55.670 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:27:55.670 "strip_size_kb": 0, 00:27:55.670 "state": "online", 00:27:55.670 "raid_level": "raid1", 00:27:55.670 "superblock": true, 00:27:55.670 "num_base_bdevs": 2, 00:27:55.670 "num_base_bdevs_discovered": 1, 00:27:55.670 "num_base_bdevs_operational": 1, 00:27:55.670 "base_bdevs_list": [ 00:27:55.670 { 00:27:55.670 "name": null, 00:27:55.670 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:55.671 "is_configured": false, 00:27:55.671 "data_offset": 256, 00:27:55.671 "data_size": 7936 00:27:55.671 }, 00:27:55.671 { 00:27:55.671 "name": "BaseBdev2", 00:27:55.671 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:27:55.671 "is_configured": true, 00:27:55.671 "data_offset": 256, 00:27:55.671 "data_size": 7936 00:27:55.671 } 00:27:55.671 ] 00:27:55.671 }' 00:27:55.671 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:55.671 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:56.240 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:56.240 [2024-07-25 13:36:36.945185] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:56.240 [2024-07-25 13:36:36.947731] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x1b4f050 00:27:56.240 [2024-07-25 13:36:36.949325] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:56.240 13:36:36 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:57.181 13:36:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:57.181 13:36:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:57.181 13:36:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:57.181 13:36:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:57.181 13:36:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:57.181 13:36:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.181 13:36:37 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.441 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:57.441 "name": "raid_bdev1", 00:27:57.441 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:27:57.441 "strip_size_kb": 0, 00:27:57.441 "state": "online", 00:27:57.441 "raid_level": "raid1", 00:27:57.441 "superblock": true, 00:27:57.441 "num_base_bdevs": 2, 00:27:57.441 "num_base_bdevs_discovered": 2, 00:27:57.441 "num_base_bdevs_operational": 2, 00:27:57.441 "process": { 00:27:57.441 "type": "rebuild", 00:27:57.441 "target": "spare", 00:27:57.441 "progress": { 00:27:57.441 "blocks": 2816, 00:27:57.441 "percent": 35 00:27:57.441 } 00:27:57.441 }, 00:27:57.441 "base_bdevs_list": [ 00:27:57.441 { 
00:27:57.441 "name": "spare", 00:27:57.441 "uuid": "169abefa-6586-5ffc-b302-98c9fa0bc02b", 00:27:57.441 "is_configured": true, 00:27:57.441 "data_offset": 256, 00:27:57.441 "data_size": 7936 00:27:57.442 }, 00:27:57.442 { 00:27:57.442 "name": "BaseBdev2", 00:27:57.442 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:27:57.442 "is_configured": true, 00:27:57.442 "data_offset": 256, 00:27:57.442 "data_size": 7936 00:27:57.442 } 00:27:57.442 ] 00:27:57.442 }' 00:27:57.442 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:57.442 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:57.442 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:57.442 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:57.442 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@668 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:57.702 [2024-07-25 13:36:38.401779] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:57.702 [2024-07-25 13:36:38.458212] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:57.702 [2024-07-25 13:36:38.458244] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:57.702 [2024-07-25 13:36:38.458254] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:57.702 [2024-07-25 13:36:38.458258] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@671 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:57.702 13:36:38 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.702 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.962 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:57.962 "name": "raid_bdev1", 00:27:57.962 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:27:57.962 "strip_size_kb": 0, 00:27:57.962 "state": "online", 00:27:57.962 "raid_level": "raid1", 00:27:57.962 "superblock": true, 00:27:57.962 "num_base_bdevs": 2, 00:27:57.962 "num_base_bdevs_discovered": 1, 00:27:57.962 "num_base_bdevs_operational": 1, 00:27:57.962 "base_bdevs_list": [ 00:27:57.962 { 00:27:57.963 "name": null, 00:27:57.963 
"uuid": "00000000-0000-0000-0000-000000000000", 00:27:57.963 "is_configured": false, 00:27:57.963 "data_offset": 256, 00:27:57.963 "data_size": 7936 00:27:57.963 }, 00:27:57.963 { 00:27:57.963 "name": "BaseBdev2", 00:27:57.963 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:27:57.963 "is_configured": true, 00:27:57.963 "data_offset": 256, 00:27:57.963 "data_size": 7936 00:27:57.963 } 00:27:57.963 ] 00:27:57.963 }' 00:27:57.963 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:57.963 13:36:38 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:58.532 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@674 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:58.532 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:58.532 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:58.532 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:58.532 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:58.532 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.532 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:58.793 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:58.793 "name": "raid_bdev1", 00:27:58.793 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:27:58.793 "strip_size_kb": 0, 00:27:58.793 "state": "online", 00:27:58.793 "raid_level": "raid1", 00:27:58.793 "superblock": true, 00:27:58.793 
"num_base_bdevs": 2, 00:27:58.793 "num_base_bdevs_discovered": 1, 00:27:58.793 "num_base_bdevs_operational": 1, 00:27:58.793 "base_bdevs_list": [ 00:27:58.793 { 00:27:58.793 "name": null, 00:27:58.793 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:58.793 "is_configured": false, 00:27:58.793 "data_offset": 256, 00:27:58.793 "data_size": 7936 00:27:58.793 }, 00:27:58.793 { 00:27:58.793 "name": "BaseBdev2", 00:27:58.793 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:27:58.793 "is_configured": true, 00:27:58.793 "data_offset": 256, 00:27:58.793 "data_size": 7936 00:27:58.793 } 00:27:58.793 ] 00:27:58.793 }' 00:27:58.793 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:58.793 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:58.793 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:58.793 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:58.793 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@677 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:59.052 [2024-07-25 13:36:39.701093] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:59.052 [2024-07-25 13:36:39.703535] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ce75b0 00:27:59.052 [2024-07-25 13:36:39.704659] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:59.052 13:36:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@678 -- # sleep 1 00:27:59.992 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@679 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:27:59.992 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:59.992 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:59.992 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:59.992 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:59.992 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:59.992 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.252 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:00.252 "name": "raid_bdev1", 00:28:00.252 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:00.252 "strip_size_kb": 0, 00:28:00.252 "state": "online", 00:28:00.252 "raid_level": "raid1", 00:28:00.252 "superblock": true, 00:28:00.252 "num_base_bdevs": 2, 00:28:00.252 "num_base_bdevs_discovered": 2, 00:28:00.252 "num_base_bdevs_operational": 2, 00:28:00.252 "process": { 00:28:00.252 "type": "rebuild", 00:28:00.252 "target": "spare", 00:28:00.252 "progress": { 00:28:00.252 "blocks": 2816, 00:28:00.252 "percent": 35 00:28:00.252 } 00:28:00.252 }, 00:28:00.252 "base_bdevs_list": [ 00:28:00.252 { 00:28:00.252 "name": "spare", 00:28:00.252 "uuid": "169abefa-6586-5ffc-b302-98c9fa0bc02b", 00:28:00.252 "is_configured": true, 00:28:00.252 "data_offset": 256, 00:28:00.252 "data_size": 7936 00:28:00.252 }, 00:28:00.252 { 00:28:00.252 "name": "BaseBdev2", 00:28:00.252 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:00.252 "is_configured": true, 00:28:00.252 "data_offset": 256, 00:28:00.252 "data_size": 7936 00:28:00.252 
} 00:28:00.252 ] 00:28:00.252 }' 00:28:00.252 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:00.252 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:00.252 13:36:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@681 -- # '[' true = true ']' 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@681 -- # '[' = false ']' 00:28:00.512 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 681: [: =: unary operator expected 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # local num_base_bdevs_operational=2 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # '[' raid1 = raid1 ']' 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # '[' 2 -gt 2 ']' 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # local timeout=1058 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.512 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:01.082 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:01.082 "name": "raid_bdev1", 00:28:01.082 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:01.082 "strip_size_kb": 0, 00:28:01.082 "state": "online", 00:28:01.082 "raid_level": "raid1", 00:28:01.082 "superblock": true, 00:28:01.082 "num_base_bdevs": 2, 00:28:01.082 "num_base_bdevs_discovered": 2, 00:28:01.082 "num_base_bdevs_operational": 2, 00:28:01.082 "process": { 00:28:01.082 "type": "rebuild", 00:28:01.082 "target": "spare", 00:28:01.082 "progress": { 00:28:01.082 "blocks": 4608, 00:28:01.082 "percent": 58 00:28:01.082 } 00:28:01.082 }, 00:28:01.082 "base_bdevs_list": [ 00:28:01.082 { 00:28:01.082 "name": "spare", 00:28:01.082 "uuid": "169abefa-6586-5ffc-b302-98c9fa0bc02b", 00:28:01.082 "is_configured": true, 00:28:01.082 "data_offset": 256, 00:28:01.082 "data_size": 7936 00:28:01.082 }, 00:28:01.082 { 00:28:01.082 "name": "BaseBdev2", 00:28:01.082 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:01.082 "is_configured": true, 00:28:01.082 "data_offset": 256, 00:28:01.082 "data_size": 7936 00:28:01.082 } 00:28:01.082 ] 00:28:01.082 }' 00:28:01.082 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:01.082 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:01.082 13:36:41 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:01.082 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:01.082 13:36:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@726 -- # sleep 1 00:28:02.021 13:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@722 -- # (( SECONDS < timeout )) 00:28:02.021 13:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@723 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:02.021 13:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:02.021 13:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:02.021 13:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:02.021 13:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:02.021 13:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.021 13:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:02.282 [2024-07-25 13:36:42.822766] bdev_raid.c:2886:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:28:02.282 [2024-07-25 13:36:42.822811] bdev_raid.c:2548:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:28:02.282 [2024-07-25 13:36:42.822879] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:02.282 13:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:02.282 "name": "raid_bdev1", 
00:28:02.282 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:02.282 "strip_size_kb": 0, 00:28:02.282 "state": "online", 00:28:02.282 "raid_level": "raid1", 00:28:02.282 "superblock": true, 00:28:02.282 "num_base_bdevs": 2, 00:28:02.282 "num_base_bdevs_discovered": 2, 00:28:02.282 "num_base_bdevs_operational": 2, 00:28:02.282 "base_bdevs_list": [ 00:28:02.282 { 00:28:02.282 "name": "spare", 00:28:02.282 "uuid": "169abefa-6586-5ffc-b302-98c9fa0bc02b", 00:28:02.282 "is_configured": true, 00:28:02.282 "data_offset": 256, 00:28:02.282 "data_size": 7936 00:28:02.282 }, 00:28:02.282 { 00:28:02.282 "name": "BaseBdev2", 00:28:02.282 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:02.282 "is_configured": true, 00:28:02.282 "data_offset": 256, 00:28:02.282 "data_size": 7936 00:28:02.282 } 00:28:02.282 ] 00:28:02.282 }' 00:28:02.282 13:36:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:02.282 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:28:02.282 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:02.542 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:28:02.542 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@724 -- # break 00:28:02.542 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@730 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:02.542 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:02.542 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:02.542 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:02.542 13:36:43 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:02.542 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:02.542 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.120 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:03.120 "name": "raid_bdev1", 00:28:03.120 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:03.120 "strip_size_kb": 0, 00:28:03.120 "state": "online", 00:28:03.120 "raid_level": "raid1", 00:28:03.120 "superblock": true, 00:28:03.120 "num_base_bdevs": 2, 00:28:03.120 "num_base_bdevs_discovered": 2, 00:28:03.120 "num_base_bdevs_operational": 2, 00:28:03.120 "base_bdevs_list": [ 00:28:03.120 { 00:28:03.120 "name": "spare", 00:28:03.120 "uuid": "169abefa-6586-5ffc-b302-98c9fa0bc02b", 00:28:03.120 "is_configured": true, 00:28:03.120 "data_offset": 256, 00:28:03.120 "data_size": 7936 00:28:03.120 }, 00:28:03.120 { 00:28:03.120 "name": "BaseBdev2", 00:28:03.120 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:03.120 "is_configured": true, 00:28:03.120 "data_offset": 256, 00:28:03.120 "data_size": 7936 00:28:03.120 } 00:28:03.120 ] 00:28:03.120 }' 00:28:03.120 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:03.120 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:03.120 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:03.120 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:03.120 13:36:43 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@731 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:03.120 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:03.120 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:03.120 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:03.120 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:03.120 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:03.121 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:03.121 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:03.121 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:03.121 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:03.121 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.121 13:36:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.426 13:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:03.426 "name": "raid_bdev1", 00:28:03.426 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:03.426 "strip_size_kb": 0, 00:28:03.426 "state": "online", 00:28:03.426 "raid_level": "raid1", 00:28:03.426 "superblock": true, 00:28:03.426 "num_base_bdevs": 2, 00:28:03.426 
"num_base_bdevs_discovered": 2, 00:28:03.426 "num_base_bdevs_operational": 2, 00:28:03.426 "base_bdevs_list": [ 00:28:03.426 { 00:28:03.426 "name": "spare", 00:28:03.426 "uuid": "169abefa-6586-5ffc-b302-98c9fa0bc02b", 00:28:03.426 "is_configured": true, 00:28:03.426 "data_offset": 256, 00:28:03.426 "data_size": 7936 00:28:03.426 }, 00:28:03.426 { 00:28:03.426 "name": "BaseBdev2", 00:28:03.426 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:03.426 "is_configured": true, 00:28:03.426 "data_offset": 256, 00:28:03.426 "data_size": 7936 00:28:03.426 } 00:28:03.426 ] 00:28:03.426 }' 00:28:03.426 13:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:03.426 13:36:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:04.367 13:36:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@734 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:28:04.627 [2024-07-25 13:36:45.207404] bdev_raid.c:2398:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:28:04.627 [2024-07-25 13:36:45.207421] bdev_raid.c:1886:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:28:04.627 [2024-07-25 13:36:45.207459] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:04.627 [2024-07-25 13:36:45.207498] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:04.627 [2024-07-25 13:36:45.207504] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ce71c0 name raid_bdev1, state offline 00:28:04.627 13:36:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:04.627 13:36:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@735 -- # jq length 00:28:05.197 13:36:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@735 -- # [[ 0 == 0 ]] 00:28:05.197 13:36:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@737 -- # '[' false = true ']' 00:28:05.197 13:36:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@758 -- # '[' true = true ']' 00:28:05.197 13:36:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:05.197 13:36:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:05.456 [2024-07-25 13:36:46.201857] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:05.456 [2024-07-25 13:36:46.201884] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:05.456 [2024-07-25 13:36:46.201897] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b4c660 00:28:05.456 [2024-07-25 13:36:46.201903] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:05.456 [2024-07-25 13:36:46.203212] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:05.456 [2024-07-25 13:36:46.203235] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:05.456 [2024-07-25 13:36:46.203275] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:05.456 [2024-07-25 13:36:46.203294] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:05.456 [2024-07-25 13:36:46.203362] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:28:05.456 spare 00:28:05.456 13:36:46 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:05.456 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:05.716 [2024-07-25 13:36:46.303645] bdev_raid.c:1721:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cf09a0 00:28:05.716 [2024-07-25 13:36:46.303655] bdev_raid.c:1722:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:28:05.716 [2024-07-25 13:36:46.303703] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b4db70 00:28:05.716 [2024-07-25 13:36:46.303771] 
bdev_raid.c:1751:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cf09a0 00:28:05.716 [2024-07-25 13:36:46.303776] bdev_raid.c:1752:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cf09a0 00:28:05.716 [2024-07-25 13:36:46.303827] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:05.716 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:05.716 "name": "raid_bdev1", 00:28:05.716 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:05.716 "strip_size_kb": 0, 00:28:05.716 "state": "online", 00:28:05.716 "raid_level": "raid1", 00:28:05.716 "superblock": true, 00:28:05.716 "num_base_bdevs": 2, 00:28:05.716 "num_base_bdevs_discovered": 2, 00:28:05.716 "num_base_bdevs_operational": 2, 00:28:05.716 "base_bdevs_list": [ 00:28:05.716 { 00:28:05.716 "name": "spare", 00:28:05.716 "uuid": "169abefa-6586-5ffc-b302-98c9fa0bc02b", 00:28:05.716 "is_configured": true, 00:28:05.716 "data_offset": 256, 00:28:05.716 "data_size": 7936 00:28:05.716 }, 00:28:05.716 { 00:28:05.716 "name": "BaseBdev2", 00:28:05.716 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:05.716 "is_configured": true, 00:28:05.716 "data_offset": 256, 00:28:05.716 "data_size": 7936 00:28:05.716 } 00:28:05.716 ] 00:28:05.716 }' 00:28:05.716 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:05.716 13:36:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:07.099 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@764 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:07.099 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:07.099 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:07.099 13:36:47 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:07.099 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:07.099 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.099 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:07.099 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:07.099 "name": "raid_bdev1", 00:28:07.099 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:07.099 "strip_size_kb": 0, 00:28:07.099 "state": "online", 00:28:07.099 "raid_level": "raid1", 00:28:07.099 "superblock": true, 00:28:07.099 "num_base_bdevs": 2, 00:28:07.099 "num_base_bdevs_discovered": 2, 00:28:07.099 "num_base_bdevs_operational": 2, 00:28:07.099 "base_bdevs_list": [ 00:28:07.099 { 00:28:07.099 "name": "spare", 00:28:07.099 "uuid": "169abefa-6586-5ffc-b302-98c9fa0bc02b", 00:28:07.099 "is_configured": true, 00:28:07.099 "data_offset": 256, 00:28:07.099 "data_size": 7936 00:28:07.099 }, 00:28:07.099 { 00:28:07.099 "name": "BaseBdev2", 00:28:07.099 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:07.099 "is_configured": true, 00:28:07.099 "data_offset": 256, 00:28:07.099 "data_size": 7936 00:28:07.099 } 00:28:07.099 ] 00:28:07.099 }' 00:28:07.099 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:07.099 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:07.099 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:07.358 13:36:47 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:07.358 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:07.358 13:36:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # jq -r '.[].base_bdevs_list[0].name' 00:28:07.619 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@765 -- # [[ spare == \s\p\a\r\e ]] 00:28:07.619 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:28:08.189 [2024-07-25 13:36:48.716335] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@769 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 
-- # local num_base_bdevs_discovered 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:08.189 13:36:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:08.449 13:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:08.449 "name": "raid_bdev1", 00:28:08.449 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:08.449 "strip_size_kb": 0, 00:28:08.449 "state": "online", 00:28:08.449 "raid_level": "raid1", 00:28:08.449 "superblock": true, 00:28:08.449 "num_base_bdevs": 2, 00:28:08.449 "num_base_bdevs_discovered": 1, 00:28:08.449 "num_base_bdevs_operational": 1, 00:28:08.449 "base_bdevs_list": [ 00:28:08.449 { 00:28:08.449 "name": null, 00:28:08.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:08.449 "is_configured": false, 00:28:08.449 "data_offset": 256, 00:28:08.449 "data_size": 7936 00:28:08.449 }, 00:28:08.449 { 00:28:08.449 "name": "BaseBdev2", 00:28:08.449 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:08.449 "is_configured": true, 00:28:08.449 "data_offset": 256, 00:28:08.449 "data_size": 7936 00:28:08.449 } 00:28:08.449 ] 00:28:08.449 }' 00:28:08.449 13:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:08.449 13:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:09.019 13:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@770 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:28:09.279 [2024-07-25 13:36:49.831173] 
bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:09.279 [2024-07-25 13:36:49.831284] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:09.279 [2024-07-25 13:36:49.831293] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:09.279 [2024-07-25 13:36:49.831311] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:09.279 [2024-07-25 13:36:49.833773] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b4ce60 00:28:09.279 [2024-07-25 13:36:49.835364] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:09.279 13:36:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # sleep 1 00:28:10.218 13:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:10.218 13:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:10.218 13:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:10.218 13:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:10.218 13:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:10.218 13:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.218 13:36:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:10.478 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:28:10.478 "name": "raid_bdev1", 00:28:10.478 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:10.478 "strip_size_kb": 0, 00:28:10.478 "state": "online", 00:28:10.478 "raid_level": "raid1", 00:28:10.478 "superblock": true, 00:28:10.478 "num_base_bdevs": 2, 00:28:10.478 "num_base_bdevs_discovered": 2, 00:28:10.478 "num_base_bdevs_operational": 2, 00:28:10.478 "process": { 00:28:10.478 "type": "rebuild", 00:28:10.478 "target": "spare", 00:28:10.478 "progress": { 00:28:10.478 "blocks": 2816, 00:28:10.478 "percent": 35 00:28:10.478 } 00:28:10.478 }, 00:28:10.478 "base_bdevs_list": [ 00:28:10.478 { 00:28:10.478 "name": "spare", 00:28:10.478 "uuid": "169abefa-6586-5ffc-b302-98c9fa0bc02b", 00:28:10.478 "is_configured": true, 00:28:10.478 "data_offset": 256, 00:28:10.478 "data_size": 7936 00:28:10.478 }, 00:28:10.478 { 00:28:10.478 "name": "BaseBdev2", 00:28:10.478 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:10.478 "is_configured": true, 00:28:10.478 "data_offset": 256, 00:28:10.478 "data_size": 7936 00:28:10.478 } 00:28:10.478 ] 00:28:10.478 }' 00:28:10.478 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:10.478 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:10.478 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:10.478 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:10.478 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:28:10.738 [2024-07-25 13:36:51.384651] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:10.738 [2024-07-25 13:36:51.444775] bdev_raid.c:2557:raid_bdev_process_finish_done: 
*WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:28:10.738 [2024-07-25 13:36:51.444806] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:28:10.738 [2024-07-25 13:36:51.444815] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:28:10.738 [2024-07-25 13:36:51.444819] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:10.738 13:36:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
jq -r '.[] | select(.name == "raid_bdev1")' 00:28:11.307 13:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:11.307 "name": "raid_bdev1", 00:28:11.307 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:11.307 "strip_size_kb": 0, 00:28:11.307 "state": "online", 00:28:11.307 "raid_level": "raid1", 00:28:11.307 "superblock": true, 00:28:11.307 "num_base_bdevs": 2, 00:28:11.307 "num_base_bdevs_discovered": 1, 00:28:11.307 "num_base_bdevs_operational": 1, 00:28:11.307 "base_bdevs_list": [ 00:28:11.307 { 00:28:11.307 "name": null, 00:28:11.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:11.307 "is_configured": false, 00:28:11.307 "data_offset": 256, 00:28:11.307 "data_size": 7936 00:28:11.307 }, 00:28:11.307 { 00:28:11.307 "name": "BaseBdev2", 00:28:11.307 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:11.307 "is_configured": true, 00:28:11.307 "data_offset": 256, 00:28:11.307 "data_size": 7936 00:28:11.307 } 00:28:11.307 ] 00:28:11.307 }' 00:28:11.307 13:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:11.307 13:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:12.246 13:36:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:28:12.506 [2024-07-25 13:36:53.129045] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:28:12.507 [2024-07-25 13:36:53.129077] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:28:12.507 [2024-07-25 13:36:53.129090] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b4ec10 00:28:12.507 [2024-07-25 13:36:53.129096] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:28:12.507 [2024-07-25 
13:36:53.129243] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:28:12.507 [2024-07-25 13:36:53.129253] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:28:12.507 [2024-07-25 13:36:53.129292] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:28:12.507 [2024-07-25 13:36:53.129299] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:28:12.507 [2024-07-25 13:36:53.129305] bdev_raid.c:3738:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:28:12.507 [2024-07-25 13:36:53.129318] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:28:12.507 [2024-07-25 13:36:53.131704] bdev_raid.c: 263:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b4ffa0 00:28:12.507 [2024-07-25 13:36:53.132823] bdev_raid.c:2921:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:28:12.507 spare 00:28:12.507 13:36:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # sleep 1 00:28:13.448 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:28:13.448 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:13.448 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:28:13.448 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:28:13.448 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:13.448 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:28:13.448 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:13.709 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:13.709 "name": "raid_bdev1", 00:28:13.709 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41", 00:28:13.709 "strip_size_kb": 0, 00:28:13.709 "state": "online", 00:28:13.709 "raid_level": "raid1", 00:28:13.709 "superblock": true, 00:28:13.709 "num_base_bdevs": 2, 00:28:13.709 "num_base_bdevs_discovered": 2, 00:28:13.709 "num_base_bdevs_operational": 2, 00:28:13.709 "process": { 00:28:13.709 "type": "rebuild", 00:28:13.709 "target": "spare", 00:28:13.709 "progress": { 00:28:13.709 "blocks": 2816, 00:28:13.709 "percent": 35 00:28:13.709 } 00:28:13.709 }, 00:28:13.709 "base_bdevs_list": [ 00:28:13.709 { 00:28:13.709 "name": "spare", 00:28:13.709 "uuid": "169abefa-6586-5ffc-b302-98c9fa0bc02b", 00:28:13.709 "is_configured": true, 00:28:13.709 "data_offset": 256, 00:28:13.709 "data_size": 7936 00:28:13.709 }, 00:28:13.709 { 00:28:13.709 "name": "BaseBdev2", 00:28:13.709 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520", 00:28:13.709 "is_configured": true, 00:28:13.709 "data_offset": 256, 00:28:13.709 "data_size": 7936 00:28:13.709 } 00:28:13.709 ] 00:28:13.709 }' 00:28:13.709 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:13.709 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:28:13.709 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:13.969 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:28:13.969 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare
00:28:13.969 [2024-07-25 13:36:54.682097] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:28:13.969 [2024-07-25 13:36:54.742222] bdev_raid.c:2557:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:28:13.969 [2024-07-25 13:36:54.742253] bdev_raid.c: 343:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:28:13.969 [2024-07-25 13:36:54.742262] bdev_raid.c:2162:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:28:13.969 [2024-07-25 13:36:54.742267] bdev_raid.c:2495:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@783 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:14.230 13:36:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:14.800 13:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:14.800 "name": "raid_bdev1",
00:28:14.800 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41",
00:28:14.800 "strip_size_kb": 0,
00:28:14.800 "state": "online",
00:28:14.800 "raid_level": "raid1",
00:28:14.800 "superblock": true,
00:28:14.800 "num_base_bdevs": 2,
00:28:14.800 "num_base_bdevs_discovered": 1,
00:28:14.800 "num_base_bdevs_operational": 1,
00:28:14.800 "base_bdevs_list": [
00:28:14.800 {
00:28:14.800 "name": null,
00:28:14.800 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:14.800 "is_configured": false,
00:28:14.800 "data_offset": 256,
00:28:14.800 "data_size": 7936
00:28:14.800 },
00:28:14.800 {
00:28:14.800 "name": "BaseBdev2",
00:28:14.800 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520",
00:28:14.800 "is_configured": true,
00:28:14.800 "data_offset": 256,
00:28:14.800 "data_size": 7936
00:28:14.800 }
00:28:14.800 ]
00:28:14.800 }'
00:28:14.800 13:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:14.800 13:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:28:15.370 13:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # verify_raid_bdev_process raid_bdev1 none none
00:28:15.370 13:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:15.370 13:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:28:15.370 13:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none
00:28:15.370 13:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:15.370 13:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:15.370 13:36:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:15.370 13:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:15.370 "name": "raid_bdev1",
00:28:15.370 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41",
00:28:15.370 "strip_size_kb": 0,
00:28:15.370 "state": "online",
00:28:15.370 "raid_level": "raid1",
00:28:15.370 "superblock": true,
00:28:15.370 "num_base_bdevs": 2,
00:28:15.370 "num_base_bdevs_discovered": 1,
00:28:15.370 "num_base_bdevs_operational": 1,
00:28:15.370 "base_bdevs_list": [
00:28:15.370 {
00:28:15.370 "name": null,
00:28:15.370 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:15.370 "is_configured": false,
00:28:15.370 "data_offset": 256,
00:28:15.370 "data_size": 7936
00:28:15.370 },
00:28:15.370 {
00:28:15.370 "name": "BaseBdev2",
00:28:15.370 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520",
00:28:15.370 "is_configured": true,
00:28:15.370 "data_offset": 256,
00:28:15.370 "data_size": 7936
00:28:15.370 }
00:28:15.370 ]
00:28:15.370 }'
00:28:15.370 13:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:15.631 13:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:28:15.631 13:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:15.631 13:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:28:15.631 13:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@787 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1
00:28:16.201 13:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@788 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1
00:28:16.201 [2024-07-25 13:36:56.947828] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc
00:28:16.201 [2024-07-25 13:36:56.947856] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:28:16.201 [2024-07-25 13:36:56.947868] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b55f70
00:28:16.201 [2024-07-25 13:36:56.947875] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed
00:28:16.201 [2024-07-25 13:36:56.948004] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:28:16.201 [2024-07-25 13:36:56.948014] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1
00:28:16.201 [2024-07-25 13:36:56.948045] bdev_raid.c:3875:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1
00:28:16.201 [2024-07-25 13:36:56.948053] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5)
00:28:16.201 [2024-07-25 13:36:56.948060] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid
00:28:16.201 BaseBdev1
00:28:16.201 13:36:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@789 -- # sleep 1
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@790 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:17.584 13:36:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:17.845 13:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:17.845 "name": "raid_bdev1",
00:28:17.845 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41",
00:28:17.845 "strip_size_kb": 0,
00:28:17.845 "state": "online",
00:28:17.845 "raid_level": "raid1",
00:28:17.845 "superblock": true,
00:28:17.845 "num_base_bdevs": 2,
00:28:17.845 "num_base_bdevs_discovered": 1,
00:28:17.845 "num_base_bdevs_operational": 1,
00:28:17.845 "base_bdevs_list": [
00:28:17.845 {
00:28:17.845 "name": null,
00:28:17.845 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:17.845 "is_configured": false,
00:28:17.845 "data_offset": 256,
00:28:17.845 "data_size": 7936
00:28:17.845 },
00:28:17.845 {
00:28:17.845 "name": "BaseBdev2",
00:28:17.845 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520",
00:28:17.845 "is_configured": true,
00:28:17.845 "data_offset": 256,
00:28:17.845 "data_size": 7936
00:28:17.845 }
00:28:17.845 ]
00:28:17.845 }'
00:28:17.845 13:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:17.845 13:36:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:28:18.786 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@791 -- # verify_raid_bdev_process raid_bdev1 none none
00:28:18.786 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:18.786 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:28:18.786 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none
00:28:18.786 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:18.786 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:18.786 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:19.046 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:19.046 "name": "raid_bdev1",
00:28:19.046 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41",
00:28:19.046 "strip_size_kb": 0,
00:28:19.046 "state": "online",
00:28:19.046 "raid_level": "raid1",
00:28:19.046 "superblock": true,
00:28:19.046 "num_base_bdevs": 2,
00:28:19.046 "num_base_bdevs_discovered": 1,
00:28:19.046 "num_base_bdevs_operational": 1,
00:28:19.046 "base_bdevs_list": [
00:28:19.046 {
00:28:19.046 "name": null,
00:28:19.046 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:19.046 "is_configured": false,
00:28:19.046 "data_offset": 256,
00:28:19.046 "data_size": 7936
00:28:19.046 },
00:28:19.046 {
00:28:19.046 "name": "BaseBdev2",
00:28:19.046 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520",
00:28:19.046 "is_configured": true,
00:28:19.046 "data_offset": 256,
00:28:19.046 "data_size": 7936
00:28:19.046 }
00:28:19.046 ]
00:28:19.046 }'
00:28:19.046 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:19.046 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:28:19.046 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:19.046 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:28:19.046 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@792 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1
00:28:19.046 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@650 -- # local es=0
00:28:19.046 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1
00:28:19.046 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@638 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:28:19.306 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:28:19.306 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:28:19.306 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:28:19.306 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:28:19.306 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@642 -- # case "$(type -t "$arg")" in
00:28:19.306 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:28:19.306 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@644 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]]
00:28:19.306 13:36:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1
00:28:19.877 [2024-07-25 13:37:00.368510] bdev_raid.c:3312:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:28:19.877 [2024-07-25 13:37:00.368607] bdev_raid.c:3680:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5)
00:28:19.877 [2024-07-25 13:37:00.368616] bdev_raid.c:3699:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid
00:28:19.877 request:
00:28:19.877 {
00:28:19.877 "base_bdev": "BaseBdev1",
00:28:19.877 "raid_bdev": "raid_bdev1",
00:28:19.877 "method": "bdev_raid_add_base_bdev",
00:28:19.877 "req_id": 1
00:28:19.877 }
00:28:19.877 Got JSON-RPC error response
00:28:19.877 response:
00:28:19.877 {
00:28:19.877 "code": -22,
00:28:19.877 "message": "Failed to add base bdev to RAID bdev: Invalid argument"
00:28:19.877 }
00:28:19.877 13:37:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@653 -- # es=1
00:28:19.877 13:37:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@661 -- # (( es > 128 ))
00:28:19.877 13:37:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@672 -- # [[ -n '' ]]
00:28:19.877 13:37:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@677 -- # (( !es == 0 ))
00:28:19.877 13:37:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@793 -- # sleep 1
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@794 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:20.816 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:21.386 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:28:21.386 "name": "raid_bdev1",
00:28:21.386 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41",
00:28:21.386 "strip_size_kb": 0,
00:28:21.386 "state": "online",
00:28:21.386 "raid_level": "raid1",
00:28:21.386 "superblock": true,
00:28:21.386 "num_base_bdevs": 2,
00:28:21.386 "num_base_bdevs_discovered": 1,
00:28:21.386 "num_base_bdevs_operational": 1,
00:28:21.386 "base_bdevs_list": [
00:28:21.386 {
00:28:21.386 "name": null,
00:28:21.386 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:21.386 "is_configured": false,
00:28:21.386 "data_offset": 256,
00:28:21.386 "data_size": 7936
00:28:21.386 },
00:28:21.386 {
00:28:21.386 "name": "BaseBdev2",
00:28:21.386 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520",
00:28:21.386 "is_configured": true,
00:28:21.386 "data_offset": 256,
00:28:21.386 "data_size": 7936
00:28:21.386 }
00:28:21.386 ]
00:28:21.386 }'
00:28:21.386 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:28:21.386 13:37:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:28:21.955 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@795 -- # verify_raid_bdev_process raid_bdev1 none none
00:28:21.955 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1
00:28:21.955 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none
00:28:21.955 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none
00:28:21.955 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info
00:28:21.955 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:28:21.955 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{
00:28:22.215 "name": "raid_bdev1",
00:28:22.215 "uuid": "1b1cfc88-2690-4c60-8a21-a4211c5aab41",
00:28:22.215 "strip_size_kb": 0,
00:28:22.215 "state": "online",
00:28:22.215 "raid_level": "raid1",
00:28:22.215 "superblock": true,
00:28:22.215 "num_base_bdevs": 2,
00:28:22.215 "num_base_bdevs_discovered": 1,
00:28:22.215 "num_base_bdevs_operational": 1,
00:28:22.215 "base_bdevs_list": [
00:28:22.215 {
00:28:22.215 "name": null,
00:28:22.215 "uuid": "00000000-0000-0000-0000-000000000000",
00:28:22.215 "is_configured": false,
00:28:22.215 "data_offset": 256,
00:28:22.215 "data_size": 7936
00:28:22.215 },
00:28:22.215 {
00:28:22.215 "name": "BaseBdev2",
00:28:22.215 "uuid": "6e5203d0-13e0-5613-9be8-3783c9aff520",
00:28:22.215 "is_configured": true,
00:28:22.215 "data_offset": 256,
00:28:22.215 "data_size": 7936
00:28:22.215 }
00:28:22.215 ]
00:28:22.215 }'
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"'
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]]
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"'
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]]
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@798 -- # killprocess 1058424
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@950 -- # '[' -z 1058424 ']'
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # kill -0 1058424
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # uname
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1058424
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1058424'
00:28:22.215 killing process with pid 1058424
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@969 -- # kill 1058424
00:28:22.215 Received shutdown signal, test time was about 60.000000 seconds
00:28:22.215
00:28:22.215 Latency(us)
00:28:22.215 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:28:22.215 ===================================================================================================================
00:28:22.215 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00
00:28:22.215 [2024-07-25 13:37:02.924414] bdev_raid.c:1374:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start
00:28:22.215 [2024-07-25 13:37:02.924476] bdev_raid.c: 487:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:28:22.215 [2024-07-25 13:37:02.924505] bdev_raid.c: 464:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:28:22.215 [2024-07-25 13:37:02.924511] bdev_raid.c: 378:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cf09a0 name raid_bdev1, state offline
00:28:22.215 13:37:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@974 -- # wait 1058424
00:28:22.215 [2024-07-25 13:37:02.939982] bdev_raid.c:1400:raid_bdev_exit: *DEBUG*: raid_bdev_exit
00:28:22.474 13:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@800 -- # return 0
00:28:22.475
00:28:22.475 real 0m30.717s
00:28:22.475 user 0m51.695s
00:28:22.475 sys 0m3.162s
00:28:22.475 13:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1126 -- # xtrace_disable
00:28:22.475 13:37:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x
00:28:22.475 ************************************
00:28:22.475 END TEST raid_rebuild_test_sb_md_interleaved
00:28:22.475 ************************************
00:28:22.475 13:37:03 bdev_raid -- bdev/bdev_raid.sh@996 -- # trap - EXIT
00:28:22.475 13:37:03 bdev_raid -- bdev/bdev_raid.sh@997 -- # cleanup
00:28:22.475 13:37:03 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 1058424 ']'
00:28:22.475 13:37:03 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 1058424
00:28:22.475 13:37:03 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest
00:28:22.475
00:28:22.475 real 17m29.965s
00:28:22.475 user 30m10.957s
00:28:22.475 sys 2m32.684s
00:28:22.475 13:37:03 bdev_raid -- common/autotest_common.sh@1126 -- # xtrace_disable
00:28:22.475 13:37:03 bdev_raid -- common/autotest_common.sh@10 -- # set +x
00:28:22.475 ************************************
00:28:22.475 END TEST bdev_raid
00:28:22.475 ************************************
00:28:22.475 13:37:03 -- spdk/autotest.sh@195 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh
00:28:22.475 13:37:03 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']'
00:28:22.475 13:37:03 -- common/autotest_common.sh@1107 -- # xtrace_disable
00:28:22.475 13:37:03 -- common/autotest_common.sh@10 -- # set +x
00:28:22.475 ************************************
00:28:22.475 START TEST bdevperf_config
00:28:22.475 ************************************
00:28:22.475 13:37:03 bdevperf_config -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh
00:28:22.737 * Looking for test storage...
00:28:22.737 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]]
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@13 -- # cat
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]'
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@19 -- # echo
00:28:22.737
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@20 -- # cat
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]]
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]'
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@19 -- # echo
00:28:22.737
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@20 -- # cat
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]]
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]'
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@19 -- # echo
00:28:22.737
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@20 -- # cat
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]]
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]'
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@19 -- # echo
00:28:22.737
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@20 -- # cat
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]]
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]'
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@19 -- # echo
00:28:22.737
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/common.sh@20 -- # cat
00:28:22.737 13:37:03 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf
00:28:25.281 13:37:05 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-07-25 13:37:03.435418] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:28:25.281 [2024-07-25 13:37:03.435480] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064067 ]
00:28:25.281 Using job config with 4 jobs
00:28:25.281 [2024-07-25 13:37:03.547990] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:28:25.281 [2024-07-25 13:37:03.624014] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:28:25.281 cpumask for '\''job0'\'' is too big
00:28:25.281 cpumask for '\''job1'\'' is too big
00:28:25.281 cpumask for '\''job2'\'' is too big
00:28:25.281 cpumask for '\''job3'\'' is too big
00:28:25.281 Running I/O for 2 seconds...
00:28:25.281
00:28:25.281 Latency(us)
00:28:25.281 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28296.17 27.63 0.00 0.00 9039.19 1600.59 13913.80
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28274.03 27.61 0.00 0.00 9029.11 1575.38 12300.60
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28251.85 27.59 0.00 0.00 9018.71 1575.38 10737.82
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28229.79 27.57 0.00 0.00 9008.59 1575.38 9275.86
00:28:25.281 ===================================================================================================================
00:28:25.281 Total : 113051.83 110.40 0.00 0.00 9023.90 1575.38 13913.80'
00:28:25.281 13:37:05 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-07-25 13:37:03.435418] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:28:25.281 [2024-07-25 13:37:03.435480] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064067 ]
00:28:25.281 Using job config with 4 jobs
00:28:25.281 [2024-07-25 13:37:03.547990] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:28:25.281 [2024-07-25 13:37:03.624014] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:28:25.281 cpumask for '\''job0'\'' is too big
00:28:25.281 cpumask for '\''job1'\'' is too big
00:28:25.281 cpumask for '\''job2'\'' is too big
00:28:25.281 cpumask for '\''job3'\'' is too big
00:28:25.281 Running I/O for 2 seconds...
00:28:25.281
00:28:25.281 Latency(us)
00:28:25.281 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28296.17 27.63 0.00 0.00 9039.19 1600.59 13913.80
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28274.03 27.61 0.00 0.00 9029.11 1575.38 12300.60
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28251.85 27.59 0.00 0.00 9018.71 1575.38 10737.82
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28229.79 27.57 0.00 0.00 9008.59 1575.38 9275.86
00:28:25.281 ===================================================================================================================
00:28:25.281 Total : 113051.83 110.40 0.00 0.00 9023.90 1575.38 13913.80'
00:28:25.281 13:37:05 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-25 13:37:03.435418] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:28:25.281 [2024-07-25 13:37:03.435480] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064067 ]
00:28:25.281 Using job config with 4 jobs
00:28:25.281 [2024-07-25 13:37:03.547990] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:28:25.281 [2024-07-25 13:37:03.624014] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:28:25.281 cpumask for '\''job0'\'' is too big
00:28:25.281 cpumask for '\''job1'\'' is too big
00:28:25.281 cpumask for '\''job2'\'' is too big
00:28:25.281 cpumask for '\''job3'\'' is too big
00:28:25.281 Running I/O for 2 seconds...
00:28:25.281
00:28:25.281 Latency(us)
00:28:25.281 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28296.17 27.63 0.00 0.00 9039.19 1600.59 13913.80
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28274.03 27.61 0.00 0.00 9029.11 1575.38 12300.60
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28251.85 27.59 0.00 0.00 9018.71 1575.38 10737.82
00:28:25.281 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:25.281 Malloc0 : 2.02 28229.79 27.57 0.00 0.00 9008.59 1575.38 9275.86
00:28:25.281 ===================================================================================================================
00:28:25.281 Total : 113051.83 110.40 0.00 0.00 9023.90 1575.38 13913.80'
00:28:25.281 13:37:05 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs'
00:28:25.281 13:37:05 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+'
00:28:25.281 13:37:05 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]]
00:28:25.281 13:37:05 bdevperf_config -- bdevperf/test_config.sh@25 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf
00:28:25.281 [2024-07-25 13:37:05.958457] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:28:25.281 [2024-07-25 13:37:05.958512] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064384 ]
00:28:25.281 [2024-07-25 13:37:06.056734] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:28:25.541 [2024-07-25 13:37:06.133519] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:28:25.541 cpumask for 'job0' is too big
00:28:25.541 cpumask for 'job1' is too big
00:28:25.541 cpumask for 'job2' is too big
00:28:25.541 cpumask for 'job3' is too big
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs
00:28:28.082 Running I/O for 2 seconds...
00:28:28.082
00:28:28.082 Latency(us)
00:28:28.082 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:28:28.082 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:28.082 Malloc0 : 2.01 27992.97 27.34 0.00 0.00 9140.33 1613.19 14216.27
00:28:28.082 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:28.082 Malloc0 : 2.01 27970.81 27.32 0.00 0.00 9130.01 1587.99 12603.08
00:28:28.082 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:28.082 Malloc0 : 2.02 28011.69 27.36 0.00 0.00 9099.04 1594.29 11040.30
00:28:28.082 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024)
00:28:28.082 Malloc0 : 2.02 27989.59 27.33 0.00 0.00 9088.47 1587.99 9427.10
00:28:28.082 ===================================================================================================================
00:28:28.082 Total : 111965.05 109.34 0.00 0.00 9114.42 1587.99 14216.27'
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]]
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]'
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@19 -- # echo
00:28:28.082
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@20 -- # cat
00:28:28.082 13:37:08 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job
job1 write Malloc0 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:28.082 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:28.082 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:28.082 13:37:08 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:30.620 13:37:10 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-07-25 13:37:08.470877] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:28:30.620 [2024-07-25 13:37:08.470936] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064825 ] 00:28:30.620 Using job config with 3 jobs 00:28:30.620 [2024-07-25 13:37:08.577644] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:30.620 [2024-07-25 13:37:08.669188] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:30.620 cpumask for '\''job0'\'' is too big 00:28:30.620 cpumask for '\''job1'\'' is too big 00:28:30.620 cpumask for '\''job2'\'' is too big 00:28:30.620 Running I/O for 2 seconds... 00:28:30.620 00:28:30.620 Latency(us) 00:28:30.620 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:30.620 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:30.620 Malloc0 : 2.01 37852.18 36.97 0.00 0.00 6752.02 1594.29 10032.05 00:28:30.620 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:30.620 Malloc0 : 2.01 37864.15 36.98 0.00 0.00 6736.79 1575.38 8469.27 00:28:30.621 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:30.621 Malloc0 : 2.02 37834.49 36.95 0.00 0.00 6729.68 1587.99 7208.96 00:28:30.621 =================================================================================================================== 00:28:30.621 Total : 113550.82 110.89 0.00 0.00 6739.48 1575.38 10032.05' 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-07-25 13:37:08.470877] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:28:30.621 [2024-07-25 13:37:08.470936] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064825 ] 00:28:30.621 Using job config with 3 jobs 00:28:30.621 [2024-07-25 13:37:08.577644] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:30.621 [2024-07-25 13:37:08.669188] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:30.621 cpumask for '\''job0'\'' is too big 00:28:30.621 cpumask for '\''job1'\'' is too big 00:28:30.621 cpumask for '\''job2'\'' is too big 00:28:30.621 Running I/O for 2 seconds... 00:28:30.621 00:28:30.621 Latency(us) 00:28:30.621 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:30.621 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:30.621 Malloc0 : 2.01 37852.18 36.97 0.00 0.00 6752.02 1594.29 10032.05 00:28:30.621 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:30.621 Malloc0 : 2.01 37864.15 36.98 0.00 0.00 6736.79 1575.38 8469.27 00:28:30.621 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:30.621 Malloc0 : 2.02 37834.49 36.95 0.00 0.00 6729.68 1587.99 7208.96 00:28:30.621 =================================================================================================================== 00:28:30.621 Total : 113550.82 110.89 0.00 0.00 6739.48 1575.38 10032.05' 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-25 13:37:08.470877] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:28:30.621 [2024-07-25 13:37:08.470936] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1064825 ] 00:28:30.621 Using job config with 3 jobs 00:28:30.621 [2024-07-25 13:37:08.577644] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:30.621 [2024-07-25 13:37:08.669188] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:30.621 cpumask for '\''job0'\'' is too big 00:28:30.621 cpumask for '\''job1'\'' is too big 00:28:30.621 cpumask for '\''job2'\'' is too big 00:28:30.621 Running I/O for 2 seconds... 00:28:30.621 00:28:30.621 Latency(us) 00:28:30.621 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:30.621 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:30.621 Malloc0 : 2.01 37852.18 36.97 0.00 0.00 6752.02 1594.29 10032.05 00:28:30.621 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:30.621 Malloc0 : 2.01 37864.15 36.98 0.00 0.00 6736.79 1575.38 8469.27 00:28:30.621 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:30.621 Malloc0 : 2.02 37834.49 36.95 0.00 0.00 6729.68 1587.99 7208.96 00:28:30.621 =================================================================================================================== 00:28:30.621 Total : 113550.82 110.89 0.00 0.00 6739.48 1575.38 10032.05' 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:30.621 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:30.621 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:30.621 13:37:10 
bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:30.621 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:30.621 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:30.621 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:30.621 13:37:10 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:33.250 13:37:13 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-07-25 13:37:11.026922] 
Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:28:33.250 [2024-07-25 13:37:11.026979] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1065307 ] 00:28:33.250 Using job config with 4 jobs 00:28:33.250 [2024-07-25 13:37:11.125726] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:33.250 [2024-07-25 13:37:11.198073] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:33.250 cpumask for '\''job0'\'' is too big 00:28:33.250 cpumask for '\''job1'\'' is too big 00:28:33.250 cpumask for '\''job2'\'' is too big 00:28:33.250 cpumask for '\''job3'\'' is too big 00:28:33.250 Running I/O for 2 seconds... 00:28:33.250 00:28:33.250 Latency(us) 00:28:33.250 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:33.250 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc0 : 2.03 13867.13 13.54 0.00 0.00 18457.28 3251.59 28432.54 00:28:33.250 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc1 : 2.03 13856.05 13.53 0.00 0.00 18455.85 4007.78 28432.54 00:28:33.250 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc0 : 2.03 13845.29 13.52 0.00 0.00 18412.32 3251.59 25206.15 00:28:33.250 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc1 : 2.04 13834.32 13.51 0.00 0.00 18413.42 3957.37 25206.15 00:28:33.250 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc0 : 2.04 13823.59 13.50 0.00 0.00 18370.79 3226.39 21979.77 00:28:33.250 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 
Malloc1 : 2.04 13812.55 13.49 0.00 0.00 18370.06 3957.37 21878.94 00:28:33.250 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc0 : 2.04 13801.84 13.48 0.00 0.00 18329.26 3251.59 18753.38 00:28:33.250 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc1 : 2.04 13900.38 13.57 0.00 0.00 18183.90 857.01 18753.38 00:28:33.250 =================================================================================================================== 00:28:33.250 Total : 110741.17 108.15 0.00 0.00 18373.89 857.01 28432.54' 00:28:33.250 13:37:13 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-07-25 13:37:11.026922] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:28:33.250 [2024-07-25 13:37:11.026979] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1065307 ] 00:28:33.250 Using job config with 4 jobs 00:28:33.250 [2024-07-25 13:37:11.125726] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:33.250 [2024-07-25 13:37:11.198073] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:33.250 cpumask for '\''job0'\'' is too big 00:28:33.250 cpumask for '\''job1'\'' is too big 00:28:33.250 cpumask for '\''job2'\'' is too big 00:28:33.250 cpumask for '\''job3'\'' is too big 00:28:33.250 Running I/O for 2 seconds... 
00:28:33.250 00:28:33.250 Latency(us) 00:28:33.250 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:33.250 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc0 : 2.03 13867.13 13.54 0.00 0.00 18457.28 3251.59 28432.54 00:28:33.250 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc1 : 2.03 13856.05 13.53 0.00 0.00 18455.85 4007.78 28432.54 00:28:33.250 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc0 : 2.03 13845.29 13.52 0.00 0.00 18412.32 3251.59 25206.15 00:28:33.250 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc1 : 2.04 13834.32 13.51 0.00 0.00 18413.42 3957.37 25206.15 00:28:33.250 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc0 : 2.04 13823.59 13.50 0.00 0.00 18370.79 3226.39 21979.77 00:28:33.250 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc1 : 2.04 13812.55 13.49 0.00 0.00 18370.06 3957.37 21878.94 00:28:33.250 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc0 : 2.04 13801.84 13.48 0.00 0.00 18329.26 3251.59 18753.38 00:28:33.250 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc1 : 2.04 13900.38 13.57 0.00 0.00 18183.90 857.01 18753.38 00:28:33.250 =================================================================================================================== 00:28:33.250 Total : 110741.17 108.15 0.00 0.00 18373.89 857.01 28432.54' 00:28:33.250 13:37:13 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-07-25 13:37:11.026922] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:28:33.250 [2024-07-25 13:37:11.026979] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1065307 ] 00:28:33.250 Using job config with 4 jobs 00:28:33.250 [2024-07-25 13:37:11.125726] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:33.250 [2024-07-25 13:37:11.198073] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:33.250 cpumask for '\''job0'\'' is too big 00:28:33.250 cpumask for '\''job1'\'' is too big 00:28:33.250 cpumask for '\''job2'\'' is too big 00:28:33.250 cpumask for '\''job3'\'' is too big 00:28:33.250 Running I/O for 2 seconds... 00:28:33.250 00:28:33.250 Latency(us) 00:28:33.250 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:33.250 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc0 : 2.03 13867.13 13.54 0.00 0.00 18457.28 3251.59 28432.54 00:28:33.250 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc1 : 2.03 13856.05 13.53 0.00 0.00 18455.85 4007.78 28432.54 00:28:33.250 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.250 Malloc0 : 2.03 13845.29 13.52 0.00 0.00 18412.32 3251.59 25206.15 00:28:33.251 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.251 Malloc1 : 2.04 13834.32 13.51 0.00 0.00 18413.42 3957.37 25206.15 00:28:33.251 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.251 Malloc0 : 2.04 13823.59 13.50 0.00 0.00 18370.79 3226.39 21979.77 00:28:33.251 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.251 Malloc1 : 2.04 13812.55 13.49 0.00 0.00 18370.06 3957.37 21878.94 00:28:33.251 
Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.251 Malloc0 : 2.04 13801.84 13.48 0.00 0.00 18329.26 3251.59 18753.38 00:28:33.251 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:33.251 Malloc1 : 2.04 13900.38 13.57 0.00 0.00 18183.90 857.01 18753.38 00:28:33.251 =================================================================================================================== 00:28:33.251 Total : 110741.17 108.15 0.00 0.00 18373.89 857.01 28432.54' 00:28:33.251 13:37:13 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:33.251 13:37:13 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:33.251 13:37:13 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:28:33.251 13:37:13 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:28:33.251 13:37:13 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:33.251 13:37:13 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:28:33.251 00:28:33.251 real 0m10.258s 00:28:33.251 user 0m9.320s 00:28:33.251 sys 0m0.801s 00:28:33.251 13:37:13 bdevperf_config -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:33.251 13:37:13 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:28:33.251 ************************************ 00:28:33.251 END TEST bdevperf_config 00:28:33.251 ************************************ 00:28:33.251 13:37:13 -- spdk/autotest.sh@196 -- # uname -s 00:28:33.251 13:37:13 -- spdk/autotest.sh@196 -- # [[ Linux == Linux ]] 00:28:33.251 13:37:13 -- spdk/autotest.sh@197 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:33.251 13:37:13 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:28:33.251 13:37:13 -- 
common/autotest_common.sh@1107 -- # xtrace_disable 00:28:33.251 13:37:13 -- common/autotest_common.sh@10 -- # set +x 00:28:33.251 ************************************ 00:28:33.251 START TEST reactor_set_interrupt 00:28:33.251 ************************************ 00:28:33.251 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:33.251 * Looking for test storage... 00:28:33.251 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:33.251 13:37:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:28:33.251 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:33.251 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:33.251 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:33.251 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:28:33.251 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:33.251 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:28:33.251 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:28:33.251 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:28:33.251 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:28:33.251 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:28:33.251 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:28:33.251 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:28:33.251 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:28:33.251 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:28:33.251 13:37:13 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:28:33.251 13:37:13 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:28:33.251 13:37:13 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:33.251 13:37:13 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:28:33.252 13:37:13 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:28:33.252 13:37:13 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:28:33.252 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:28:33.252 #define SPDK_CONFIG_H 00:28:33.252 #define SPDK_CONFIG_APPS 1 00:28:33.252 #define SPDK_CONFIG_ARCH native 00:28:33.252 #undef SPDK_CONFIG_ASAN 00:28:33.252 #undef SPDK_CONFIG_AVAHI 00:28:33.252 #undef SPDK_CONFIG_CET 00:28:33.252 #define SPDK_CONFIG_COVERAGE 1 00:28:33.252 #define SPDK_CONFIG_CROSS_PREFIX 
00:28:33.252 #define SPDK_CONFIG_CRYPTO 1 00:28:33.252 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:28:33.252 #undef SPDK_CONFIG_CUSTOMOCF 00:28:33.252 #undef SPDK_CONFIG_DAOS 00:28:33.252 #define SPDK_CONFIG_DAOS_DIR 00:28:33.252 #define SPDK_CONFIG_DEBUG 1 00:28:33.252 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:28:33.252 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:33.252 #define SPDK_CONFIG_DPDK_INC_DIR 00:28:33.252 #define SPDK_CONFIG_DPDK_LIB_DIR 00:28:33.252 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:28:33.252 #undef SPDK_CONFIG_DPDK_UADK 00:28:33.252 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:33.252 #define SPDK_CONFIG_EXAMPLES 1 00:28:33.252 #undef SPDK_CONFIG_FC 00:28:33.252 #define SPDK_CONFIG_FC_PATH 00:28:33.252 #define SPDK_CONFIG_FIO_PLUGIN 1 00:28:33.252 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:28:33.252 #undef SPDK_CONFIG_FUSE 00:28:33.252 #undef SPDK_CONFIG_FUZZER 00:28:33.252 #define SPDK_CONFIG_FUZZER_LIB 00:28:33.252 #undef SPDK_CONFIG_GOLANG 00:28:33.252 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:28:33.252 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:28:33.252 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:28:33.252 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:28:33.252 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:28:33.252 #undef SPDK_CONFIG_HAVE_LIBBSD 00:28:33.252 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:28:33.252 #define SPDK_CONFIG_IDXD 1 00:28:33.252 #define SPDK_CONFIG_IDXD_KERNEL 1 00:28:33.252 #define SPDK_CONFIG_IPSEC_MB 1 00:28:33.252 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:33.252 #define SPDK_CONFIG_ISAL 1 00:28:33.252 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:28:33.252 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:28:33.252 #define SPDK_CONFIG_LIBDIR 00:28:33.252 #undef SPDK_CONFIG_LTO 00:28:33.252 #define SPDK_CONFIG_MAX_LCORES 128 00:28:33.252 #define SPDK_CONFIG_NVME_CUSE 1 00:28:33.252 #undef 
SPDK_CONFIG_OCF 00:28:33.252 #define SPDK_CONFIG_OCF_PATH 00:28:33.252 #define SPDK_CONFIG_OPENSSL_PATH 00:28:33.252 #undef SPDK_CONFIG_PGO_CAPTURE 00:28:33.252 #define SPDK_CONFIG_PGO_DIR 00:28:33.252 #undef SPDK_CONFIG_PGO_USE 00:28:33.252 #define SPDK_CONFIG_PREFIX /usr/local 00:28:33.252 #undef SPDK_CONFIG_RAID5F 00:28:33.252 #undef SPDK_CONFIG_RBD 00:28:33.252 #define SPDK_CONFIG_RDMA 1 00:28:33.252 #define SPDK_CONFIG_RDMA_PROV verbs 00:28:33.252 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:28:33.252 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:28:33.252 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:28:33.252 #define SPDK_CONFIG_SHARED 1 00:28:33.252 #undef SPDK_CONFIG_SMA 00:28:33.252 #define SPDK_CONFIG_TESTS 1 00:28:33.252 #undef SPDK_CONFIG_TSAN 00:28:33.252 #define SPDK_CONFIG_UBLK 1 00:28:33.252 #define SPDK_CONFIG_UBSAN 1 00:28:33.252 #undef SPDK_CONFIG_UNIT_TESTS 00:28:33.252 #undef SPDK_CONFIG_URING 00:28:33.252 #define SPDK_CONFIG_URING_PATH 00:28:33.252 #undef SPDK_CONFIG_URING_ZNS 00:28:33.252 #undef SPDK_CONFIG_USDT 00:28:33.252 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:28:33.252 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:28:33.252 #undef SPDK_CONFIG_VFIO_USER 00:28:33.252 #define SPDK_CONFIG_VFIO_USER_DIR 00:28:33.252 #define SPDK_CONFIG_VHOST 1 00:28:33.252 #define SPDK_CONFIG_VIRTIO 1 00:28:33.252 #undef SPDK_CONFIG_VTUNE 00:28:33.252 #define SPDK_CONFIG_VTUNE_DIR 00:28:33.252 #define SPDK_CONFIG_WERROR 1 00:28:33.252 #define SPDK_CONFIG_WPDK_DIR 00:28:33.252 #undef SPDK_CONFIG_XNVME 00:28:33.252 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:28:33.252 13:37:13 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:28:33.252 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:33.252 13:37:13 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
00:28:33.252 13:37:13 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:33.252 13:37:13 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:33.252 13:37:13 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:33.252 13:37:13 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:33.252 13:37:13 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:33.252 13:37:13 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:28:33.252 13:37:13 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:33.252 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:33.252 13:37:13 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:28:33.253 13:37:13 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:28:33.253 13:37:13 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:28:33.253 13:37:13 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:28:33.253 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 1 00:28:33.254 
13:37:13 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@166 -- # : 0 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@173 -- # : 0 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@177 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@178 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@179 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@179 -- # 
VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@180 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@180 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@183 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@183 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@187 -- # export 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@187 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@191 -- # export PYTHONDONTWRITEBYTECODE=1 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@191 -- # PYTHONDONTWRITEBYTECODE=1 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@195 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@195 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@196 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@196 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@200 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@201 -- # rm -rf /var/tmp/asan_suppression_file 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@202 -- # cat 00:28:33.254 
13:37:13 reactor_set_interrupt -- common/autotest_common.sh@238 -- # echo leak:libfuse3.so 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@240 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@242 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@242 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@244 -- # '[' -z /var/spdk/dependencies ']' 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@247 -- # export DEPENDENCY_DIR 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@251 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@251 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@252 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@252 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@255 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@255 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:33.254 13:37:13 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@258 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@258 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@261 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@261 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@264 -- # '[' 0 -eq 0 ']' 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@265 -- # export valgrind= 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@265 -- # valgrind= 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@271 -- # uname -s 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@271 -- # '[' Linux = Linux ']' 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@272 -- # HUGEMEM=4096 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@273 -- # export CLEAR_HUGE=yes 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@273 -- # CLEAR_HUGE=yes 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@274 -- # [[ 1 -eq 1 ]] 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@278 -- # export HUGE_EVEN_ALLOC=yes 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@278 -- # HUGE_EVEN_ALLOC=yes 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@281 -- # MAKE=make 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@282 -- # MAKEFLAGS=-j128 00:28:33.254 13:37:13 reactor_set_interrupt -- 
common/autotest_common.sh@298 -- # export HUGEMEM=4096 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@298 -- # HUGEMEM=4096 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@300 -- # NO_HUGE=() 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@301 -- # TEST_MODE= 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@320 -- # [[ -z 1065674 ]] 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@320 -- # kill -0 1065674 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@330 -- # [[ -v testdir ]] 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@332 -- # local requested_size=2147483648 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local mount target_dir 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@335 -- # local -A mounts fss sizes avails uses 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local source fs size avail mount use 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@338 -- # local storage_fallback storage_candidates 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@340 -- # mktemp -udt spdk.XXXXXX 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@340 -- # storage_fallback=/tmp/spdk.xFz0XY 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@345 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@347 -- # [[ -n '' ]] 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@352 -- # [[ -n '' ]] 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@357 -- # mkdir 
-p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.xFz0XY/tests/interrupt /tmp/spdk.xFz0XY 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@360 -- # requested_size=2214592512 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@329 -- # df -T 00:28:33.254 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@329 -- # grep -v Filesystem 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_devtmpfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=devtmpfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=67108864 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=67108864 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=0 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=/dev/pmem0 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=ext2 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=954712064 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=5284429824 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4329717760 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_root 00:28:33.255 13:37:13 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=overlay 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=123359281152 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=129376288768 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=6017007616 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=64683433984 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=64688144384 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4710400 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=25865363456 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=25875259392 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=9895936 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=efivarfs 00:28:33.255 13:37:13 reactor_set_interrupt 
-- common/autotest_common.sh@363 -- # fss["$mount"]=efivarfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=339968 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=507904 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=163840 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=64687423488 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=64688144384 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=720896 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # avails["$mount"]=12937621504 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@364 -- # sizes["$mount"]=12937625600 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@365 -- # uses["$mount"]=4096 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@368 -- # printf '* Looking for test storage...\n' 00:28:33.255 * Looking for test storage... 
00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@370 -- # local target_space new_size 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@371 -- # for target_dir in "${storage_candidates[@]}" 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@374 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@374 -- # awk '$1 !~ /Filesystem/{print $6}' 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@374 -- # mount=/ 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@376 -- # target_space=123359281152 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@377 -- # (( target_space == 0 || target_space < requested_size )) 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@380 -- # (( target_space >= requested_size )) 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ overlay == tmpfs ]] 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ overlay == ramfs ]] 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@382 -- # [[ / == / ]] 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@383 -- # new_size=8231600128 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@384 -- # (( new_size * 100 / sizes[/] > 95 )) 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@389 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@389 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@390 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 
00:28:33.255 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@391 -- # return 0 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # set -o errtrace 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@1687 -- # true 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@1689 -- # xtrace_fd 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:28:33.255 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:28:33.255 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:28:33.255 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:33.255 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:28:33.255 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:28:33.255 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:28:33.255 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:28:33.255 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:28:33.256 13:37:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:33.256 13:37:13 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:33.256 13:37:13 
reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:28:33.256 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:33.256 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:28:33.256 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1065721 00:28:33.256 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:33.256 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1065721 /var/tmp/spdk.sock 00:28:33.256 13:37:13 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:28:33.256 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@831 -- # '[' -z 1065721 ']' 00:28:33.256 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:33.256 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:33.256 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:33.256 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:33.256 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:33.256 13:37:13 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:28:33.256 [2024-07-25 13:37:13.909750] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:28:33.256 [2024-07-25 13:37:13.909811] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1065721 ] 00:28:33.256 [2024-07-25 13:37:13.981710] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:33.516 [2024-07-25 13:37:14.047494] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:33.516 [2024-07-25 13:37:14.047642] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:33.516 [2024-07-25 13:37:14.047643] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:33.516 [2024-07-25 13:37:14.097476] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:28:33.516 13:37:14 reactor_set_interrupt -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:33.516 13:37:14 reactor_set_interrupt -- common/autotest_common.sh@864 -- # return 0 00:28:33.516 13:37:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:28:33.516 13:37:14 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:33.776 Malloc0 00:28:33.776 Malloc1 00:28:33.776 Malloc2 00:28:33.776 13:37:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:28:33.776 13:37:14 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:28:33.776 13:37:14 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:28:33.776 13:37:14 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:28:33.776 5000+0 records in 00:28:33.776 5000+0 records out 00:28:33.776 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0185745 s, 551 MB/s 
00:28:33.776 13:37:14 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:28:34.036 AIO0 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 1065721 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 1065721 without_thd 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1065721 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:28:34.036 13:37:14 reactor_set_interrupt 
-- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:34.036 13:37:14 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:28:34.297 spdk_thread ids are 1 on reactor0. 
00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1065721 0 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1065721 0 idle 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1065721 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1065721 -w 256 00:28:34.297 13:37:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1065721 root 20 0 128.2g 34816 22528 S 6.7 0.0 0:00.27 reactor_0' 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1065721 root 20 0 128.2g 34816 22528 S 6.7 0.0 0:00.27 reactor_0 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=6.7 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=6 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 6 -gt 30 ]] 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1065721 1 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1065721 1 idle 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1065721 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1065721 -w 256 00:28:34.557 13:37:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1065755 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_1' 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1065755 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_1 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:34.816 13:37:15 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1065721 2 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1065721 2 idle 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1065721 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1065721 -w 256 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:34.816 13:37:15 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1065756 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_2' 
00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1065756 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.00 reactor_2 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:28:34.817 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:28:35.076 [2024-07-25 13:37:15.724611] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:28:35.076 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:28:35.337 [2024-07-25 13:37:15.920157] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 
00:28:35.337 [2024-07-25 13:37:15.920529] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:35.337 13:37:15 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:28:35.337 [2024-07-25 13:37:16.120128] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:28:35.337 [2024-07-25 13:37:16.120539] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1065721 0 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1065721 0 busy 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1065721 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1065721 -w 256 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1065721 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.65 reactor_0' 00:28:35.597 13:37:16 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # echo 1065721 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.65 reactor_0 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1065721 2 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1065721 2 busy 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1065721 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1065721 -w 256 00:28:35.597 13:37:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:35.857 
13:37:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1065756 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.35 reactor_2' 00:28:35.857 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1065756 root 20 0 128.2g 34816 22528 R 99.9 0.0 0:00.35 reactor_2 00:28:35.857 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:35.857 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:35.857 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:35.857 13:37:16 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:35.857 13:37:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:35.857 13:37:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:35.857 13:37:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:35.857 13:37:16 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:35.857 13:37:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:28:36.116 [2024-07-25 13:37:16.668120] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:28:36.116 [2024-07-25 13:37:16.668229] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1065721 2 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1065721 2 idle 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1065721 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1065721 -w 256 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:36.116 13:37:16 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1065756 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.54 reactor_2' 00:28:36.117 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1065756 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:00.54 reactor_2 00:28:36.117 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:36.117 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:36.117 13:37:16 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:36.117 13:37:16 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:36.117 13:37:16 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:36.117 13:37:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:36.117 13:37:16 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:36.117 13:37:16 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:36.117 13:37:16 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:28:36.377 [2024-07-25 13:37:17.056115] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:28:36.377 [2024-07-25 13:37:17.056315] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:36.377 13:37:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:28:36.377 13:37:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:28:36.377 13:37:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:28:36.636 [2024-07-25 13:37:17.268384] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1065721 0 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1065721 0 idle 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1065721 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1065721 -w 256 00:28:36.636 13:37:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:36.896 13:37:17 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1065721 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:01.41 reactor_0' 00:28:36.896 13:37:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1065721 root 20 0 128.2g 34816 22528 S 0.0 0.0 0:01.41 reactor_0 00:28:36.896 13:37:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = 
\i\d\l\e ]] 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 1065721 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@950 -- # '[' -z 1065721 ']' 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@954 -- # kill -0 1065721 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@955 -- # uname 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1065721 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1065721' 00:28:36.897 killing process with pid 1065721 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@969 -- # kill 1065721 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@974 -- # wait 1065721 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:28:36.897 13:37:17 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1066410 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1066410 /var/tmp/spdk.sock 00:28:36.897 13:37:17 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@831 -- # '[' -z 1066410 ']' 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:36.897 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:36.897 13:37:17 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:28:37.158 [2024-07-25 13:37:17.705194] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:28:37.158 [2024-07-25 13:37:17.705251] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1066410 ] 00:28:37.158 [2024-07-25 13:37:17.791141] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:37.158 [2024-07-25 13:37:17.861447] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:37.158 [2024-07-25 13:37:17.861598] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:37.158 [2024-07-25 13:37:17.861600] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:37.158 [2024-07-25 13:37:17.912526] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:28:38.097 13:37:18 reactor_set_interrupt -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:38.097 13:37:18 reactor_set_interrupt -- common/autotest_common.sh@864 -- # return 0 00:28:38.097 13:37:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:28:38.097 13:37:18 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:38.097 Malloc0 00:28:38.097 Malloc1 00:28:38.097 Malloc2 00:28:38.097 13:37:18 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:28:38.097 13:37:18 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:28:38.097 13:37:18 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:28:38.097 13:37:18 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:28:38.357 5000+0 records in 00:28:38.357 5000+0 records out 00:28:38.357 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0165437 s, 619 MB/s 
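The aiofile preparation traced above (zero-fill a backing file, then create an AIO bdev on it) boils down to two commands; a minimal sketch, with a hypothetical path in place of the workspace-specific one:

```shell
#!/usr/bin/env bash
# Sketch of the AIO backing-file setup seen in the trace above.
# AIOFILE is a hypothetical path; 2048-byte blocks x 5000 = 10240000 bytes,
# matching the "10240000 bytes (10 MB, 9.8 MiB) copied" dd output in the log.
AIOFILE=/tmp/aiofile
dd if=/dev/zero of="$AIOFILE" bs=2048 count=5000 status=none
# With a running SPDK target, the file then backs an AIO bdev via RPC:
#   scripts/rpc.py bdev_aio_create "$AIOFILE" AIO0 2048
```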
00:28:38.357 13:37:18 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:28:38.357 AIO0 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 1066410 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 1066410 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=1066410 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:38.357 13:37:19 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:38.619 13:37:19 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:28:38.619 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:28:38.619 13:37:19 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:28:38.619 13:37:19 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:28:38.619 13:37:19 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:38.619 13:37:19 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:28:38.619 13:37:19 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:38.619 13:37:19 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:38.619 13:37:19 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:28:38.881 spdk_thread ids are 1 on reactor0. 
00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1066410 0 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1066410 0 idle 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1066410 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1066410 -w 256 00:28:38.881 13:37:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1066410 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.30 reactor_0' 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1066410 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.30 reactor_0 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1066410 1 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1066410 1 idle 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1066410 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1066410 -w 256 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1066447 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.00 reactor_1' 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1066447 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.00 reactor_1 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:39.142 13:37:19 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 1066410 2 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1066410 2 idle 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1066410 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1066410 -w 256 00:28:39.142 13:37:19 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:39.403 13:37:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1066448 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.00 reactor_2' 
00:28:39.403 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1066448 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.00 reactor_2 00:28:39.403 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:39.403 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:39.403 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:39.404 13:37:20 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:39.404 13:37:20 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:39.404 13:37:20 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:39.404 13:37:20 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:39.404 13:37:20 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:39.404 13:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:28:39.404 13:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:28:39.664 [2024-07-25 13:37:20.238193] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:28:39.664 [2024-07-25 13:37:20.238405] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:28:39.664 [2024-07-25 13:37:20.238706] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:39.664 13:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:28:39.664 [2024-07-25 13:37:20.442536] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:28:39.664 [2024-07-25 13:37:20.442832] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1066410 0 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1066410 0 busy 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1066410 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1066410 -w 256 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1066410 root 20 0 128.2g 35840 23552 R 99.9 0.0 0:00.71 reactor_0' 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1066410 root 20 0 128.2g 35840 23552 R 99.9 0.0 0:00.71 reactor_0 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:39.924 13:37:20 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 1066410 2 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 1066410 2 busy 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1066410 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1066410 -w 256 00:28:39.924 13:37:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:40.185 13:37:20 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1066448 root 20 0 128.2g 35840 23552 R 99.9 0.0 0:00.36 reactor_2' 00:28:40.185 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1066448 root 20 0 128.2g 35840 23552 R 99.9 0.0 0:00.36 reactor_2 00:28:40.185 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:40.185 13:37:20 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:40.185 13:37:20 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:40.185 13:37:20 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:40.185 13:37:20 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:40.185 13:37:20 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:40.185 13:37:20 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:40.185 13:37:20 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:40.185 13:37:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:28:40.445 [2024-07-25 13:37:21.012035] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:28:40.445 [2024-07-25 13:37:21.012235] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 1066410 2 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1066410 2 idle 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1066410 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1066410 -w 256 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1066448 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.56 reactor_2' 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1066448 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:00.56 reactor_2 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:40.445 13:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:28:40.705 [2024-07-25 13:37:21.409044] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:28:40.705 [2024-07-25 13:37:21.409361] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:28:40.705 [2024-07-25 13:37:21.409381] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 1066410 0 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 1066410 0 idle 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=1066410 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 1066410 -w 256 00:28:40.705 13:37:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='1066410 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:01.48 reactor_0' 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 1066410 root 20 0 128.2g 35840 23552 S 0.0 0.0 0:01.48 reactor_0 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:40.964 13:37:21 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:28:40.964 13:37:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 1066410 00:28:40.964 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@950 -- # '[' -z 1066410 ']' 00:28:40.964 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@954 -- # kill -0 1066410 00:28:40.964 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@955 -- # uname 00:28:40.964 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:40.964 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1066410 00:28:40.964 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:40.964 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:40.964 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1066410' 00:28:40.964 killing process with pid 1066410 00:28:40.964 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@969 -- # kill 1066410 00:28:40.964 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@974 -- # wait 1066410 00:28:41.225 13:37:21 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:28:41.225 13:37:21 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:28:41.225 00:28:41.225 real 0m8.252s 00:28:41.225 user 0m8.155s 00:28:41.225 sys 0m1.587s 00:28:41.225 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:41.225 13:37:21 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:28:41.225 ************************************ 00:28:41.225 END TEST reactor_set_interrupt 00:28:41.225 ************************************ 00:28:41.225 13:37:21 -- spdk/autotest.sh@198 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:28:41.225 13:37:21 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:28:41.225 13:37:21 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:41.225 13:37:21 -- common/autotest_common.sh@10 -- # set +x 00:28:41.225 ************************************ 00:28:41.225 START TEST reap_unregistered_poller 00:28:41.225 ************************************ 00:28:41.225 13:37:21 reap_unregistered_poller -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:28:41.225 * Looking for test storage... 
00:28:41.487 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:41.487 13:37:22 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:28:41.487 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:28:41.487 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:41.487 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:41.487 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:28:41.487 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:41.487 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:28:41.487 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:28:41.487 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:28:41.487 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:28:41.487 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:28:41.487 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:28:41.487 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:28:41.487 13:37:22 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:28:41.487 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:28:41.487 13:37:22 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:28:41.487 
13:37:22 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:28:41.487 13:37:22 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES=128 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:28:41.487 13:37:22 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:28:41.488 13:37:22 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:28:41.488 13:37:22 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:28:41.488 13:37:22 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:28:41.488 13:37:22 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:28:41.488 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:28:41.488 #define SPDK_CONFIG_H 00:28:41.488 #define SPDK_CONFIG_APPS 1 00:28:41.488 #define SPDK_CONFIG_ARCH native 00:28:41.488 #undef SPDK_CONFIG_ASAN 00:28:41.488 #undef SPDK_CONFIG_AVAHI 00:28:41.488 #undef SPDK_CONFIG_CET 00:28:41.488 #define SPDK_CONFIG_COVERAGE 1 00:28:41.488 #define SPDK_CONFIG_CROSS_PREFIX 00:28:41.488 #define SPDK_CONFIG_CRYPTO 1 00:28:41.488 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:28:41.488 #undef SPDK_CONFIG_CUSTOMOCF 00:28:41.488 #undef SPDK_CONFIG_DAOS 00:28:41.488 #define SPDK_CONFIG_DAOS_DIR 00:28:41.488 #define SPDK_CONFIG_DEBUG 1 00:28:41.488 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:28:41.488 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:41.488 #define SPDK_CONFIG_DPDK_INC_DIR 00:28:41.488 #define SPDK_CONFIG_DPDK_LIB_DIR 00:28:41.488 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:28:41.488 #undef SPDK_CONFIG_DPDK_UADK 00:28:41.488 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:41.488 #define SPDK_CONFIG_EXAMPLES 1 00:28:41.488 #undef SPDK_CONFIG_FC 00:28:41.488 #define SPDK_CONFIG_FC_PATH 00:28:41.488 #define SPDK_CONFIG_FIO_PLUGIN 1 00:28:41.488 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:28:41.488 #undef SPDK_CONFIG_FUSE 00:28:41.488 #undef SPDK_CONFIG_FUZZER 00:28:41.488 #define SPDK_CONFIG_FUZZER_LIB 00:28:41.488 #undef SPDK_CONFIG_GOLANG 00:28:41.488 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:28:41.488 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:28:41.488 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:28:41.488 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:28:41.488 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:28:41.488 #undef SPDK_CONFIG_HAVE_LIBBSD 00:28:41.488 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:28:41.488 #define SPDK_CONFIG_IDXD 1 00:28:41.488 #define SPDK_CONFIG_IDXD_KERNEL 1 00:28:41.488 #define SPDK_CONFIG_IPSEC_MB 1 00:28:41.488 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:41.488 #define SPDK_CONFIG_ISAL 1 00:28:41.488 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:28:41.488 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:28:41.488 #define SPDK_CONFIG_LIBDIR 00:28:41.488 #undef SPDK_CONFIG_LTO 00:28:41.488 #define SPDK_CONFIG_MAX_LCORES 128 00:28:41.488 #define SPDK_CONFIG_NVME_CUSE 1 00:28:41.488 #undef SPDK_CONFIG_OCF 00:28:41.488 #define SPDK_CONFIG_OCF_PATH 00:28:41.488 #define SPDK_CONFIG_OPENSSL_PATH 00:28:41.488 #undef SPDK_CONFIG_PGO_CAPTURE 00:28:41.488 #define SPDK_CONFIG_PGO_DIR 00:28:41.488 #undef SPDK_CONFIG_PGO_USE 00:28:41.488 #define SPDK_CONFIG_PREFIX /usr/local 00:28:41.488 #undef SPDK_CONFIG_RAID5F 00:28:41.488 #undef SPDK_CONFIG_RBD 00:28:41.488 #define SPDK_CONFIG_RDMA 1 00:28:41.488 #define SPDK_CONFIG_RDMA_PROV verbs 00:28:41.488 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:28:41.488 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:28:41.488 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:28:41.488 #define 
SPDK_CONFIG_SHARED 1 00:28:41.488 #undef SPDK_CONFIG_SMA 00:28:41.488 #define SPDK_CONFIG_TESTS 1 00:28:41.488 #undef SPDK_CONFIG_TSAN 00:28:41.488 #define SPDK_CONFIG_UBLK 1 00:28:41.488 #define SPDK_CONFIG_UBSAN 1 00:28:41.488 #undef SPDK_CONFIG_UNIT_TESTS 00:28:41.488 #undef SPDK_CONFIG_URING 00:28:41.488 #define SPDK_CONFIG_URING_PATH 00:28:41.488 #undef SPDK_CONFIG_URING_ZNS 00:28:41.488 #undef SPDK_CONFIG_USDT 00:28:41.488 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:28:41.488 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:28:41.488 #undef SPDK_CONFIG_VFIO_USER 00:28:41.488 #define SPDK_CONFIG_VFIO_USER_DIR 00:28:41.488 #define SPDK_CONFIG_VHOST 1 00:28:41.488 #define SPDK_CONFIG_VIRTIO 1 00:28:41.488 #undef SPDK_CONFIG_VTUNE 00:28:41.488 #define SPDK_CONFIG_VTUNE_DIR 00:28:41.488 #define SPDK_CONFIG_WERROR 1 00:28:41.488 #define SPDK_CONFIG_WPDK_DIR 00:28:41.488 #undef SPDK_CONFIG_XNVME 00:28:41.488 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:28:41.488 13:37:22 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:28:41.488 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:41.488 13:37:22 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:28:41.488 13:37:22 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:41.488 13:37:22 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:41.488 13:37:22 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:41.488 13:37:22 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:41.488 13:37:22 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:41.488 13:37:22 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:28:41.488 13:37:22 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:41.488 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:28:41.488 13:37:22 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:28:41.488 13:37:22 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:28:41.488 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:28:41.488 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:28:41.489 13:37:22 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:28:41.489 13:37:22 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:28:41.489 13:37:22 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:28:41.489 13:37:22 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:28:41.489 13:37:22 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 1 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_DSA 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@166 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@167 -- # export SPDK_TEST_ACCEL_IAA 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_FUZZER_TARGET 00:28:41.489 13:37:22 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_TEST_NVMF_MDNS 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@173 -- # : 0 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@174 -- # export SPDK_JSONRPC_GO_CLIENT 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@177 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@178 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@179 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@179 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:41.489 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@180 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@180 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@183 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@183 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@187 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:41.490 13:37:22 
reap_unregistered_poller -- common/autotest_common.sh@187 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@191 -- # export PYTHONDONTWRITEBYTECODE=1 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@191 -- # PYTHONDONTWRITEBYTECODE=1 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@195 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@195 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@196 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@196 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@200 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@201 -- # rm -rf /var/tmp/asan_suppression_file 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@202 -- # cat 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@238 -- # echo leak:libfuse3.so 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:41.490 13:37:22 reap_unregistered_poller -- 
common/autotest_common.sh@240 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@242 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@242 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@244 -- # '[' -z /var/spdk/dependencies ']' 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@247 -- # export DEPENDENCY_DIR 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@251 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@251 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@252 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@252 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@255 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@255 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@256 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@258 -- # export 
AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@258 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@261 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@261 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@264 -- # '[' 0 -eq 0 ']' 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@265 -- # export valgrind= 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@265 -- # valgrind= 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@271 -- # uname -s 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@271 -- # '[' Linux = Linux ']' 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@272 -- # HUGEMEM=4096 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@273 -- # export CLEAR_HUGE=yes 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@273 -- # CLEAR_HUGE=yes 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@274 -- # [[ 1 -eq 1 ]] 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@278 -- # export HUGE_EVEN_ALLOC=yes 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@278 -- # HUGE_EVEN_ALLOC=yes 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@281 -- # MAKE=make 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@282 -- # MAKEFLAGS=-j128 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@298 -- # export HUGEMEM=4096 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@298 -- # HUGEMEM=4096 
00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@300 -- # NO_HUGE=() 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@301 -- # TEST_MODE= 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@320 -- # [[ -z 1067382 ]] 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@320 -- # kill -0 1067382 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@1680 -- # set_test_storage 2147483648 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@330 -- # [[ -v testdir ]] 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@332 -- # local requested_size=2147483648 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local mount target_dir 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@335 -- # local -A mounts fss sizes avails uses 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local source fs size avail mount use 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@338 -- # local storage_fallback storage_candidates 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@340 -- # mktemp -udt spdk.XXXXXX 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@340 -- # storage_fallback=/tmp/spdk.I5ZQnz 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@345 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@347 -- # [[ -n '' ]] 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@352 -- # [[ -n '' ]] 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@357 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.I5ZQnz/tests/interrupt 
/tmp/spdk.I5ZQnz 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@360 -- # requested_size=2214592512 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@329 -- # df -T 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@329 -- # grep -v Filesystem 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_devtmpfs 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=devtmpfs 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=67108864 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=67108864 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=0 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=/dev/pmem0 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=ext2 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=954712064 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=5284429824 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4329717760 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=spdk_root 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # 
fss["$mount"]=overlay 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=123359125504 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=129376288768 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=6017163264 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=64683433984 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=64688144384 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4710400 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=25865363456 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=25875259392 00:28:41.490 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=9895936 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=efivarfs 00:28:41.491 13:37:22 reap_unregistered_poller -- 
common/autotest_common.sh@363 -- # fss["$mount"]=efivarfs 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=339968 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=507904 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=163840 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=64687423488 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=64688144384 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=720896 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # mounts["$mount"]=tmpfs 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@363 -- # fss["$mount"]=tmpfs 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # avails["$mount"]=12937621504 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@364 -- # sizes["$mount"]=12937625600 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@365 -- # uses["$mount"]=4096 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@362 -- # read -r source fs size use avail _ mount 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@368 -- # printf '* Looking for test storage...\n' 00:28:41.491 * 
Looking for test storage... 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@370 -- # local target_space new_size 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@371 -- # for target_dir in "${storage_candidates[@]}" 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@374 -- # awk '$1 !~ /Filesystem/{print $6}' 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@374 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@374 -- # mount=/ 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@376 -- # target_space=123359125504 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@377 -- # (( target_space == 0 || target_space < requested_size )) 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@380 -- # (( target_space >= requested_size )) 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ overlay == tmpfs ]] 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ overlay == ramfs ]] 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@382 -- # [[ / == / ]] 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@383 -- # new_size=8231755776 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@384 -- # (( new_size * 100 / sizes[/] > 95 )) 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@389 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@389 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@390 -- # printf '* Found test 
storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:41.491 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@391 -- # return 0 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # set -o errtrace 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # shopt -s extdebug 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@1684 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # PS4=' \t ${test_domain:-} -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@1687 -- # true 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@1689 -- # xtrace_fd 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 
0 : 0 - 1]' 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=1067436 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 1067436 /var/tmp/spdk.sock 00:28:41.491 13:37:22 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@831 -- # '[' -z 1067436 ']' 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:41.491 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:41.491 13:37:22 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:41.491 [2024-07-25 13:37:22.232335] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:28:41.491 [2024-07-25 13:37:22.232399] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1067436 ] 00:28:41.752 [2024-07-25 13:37:22.325743] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:41.752 [2024-07-25 13:37:22.423530] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:41.752 [2024-07-25 13:37:22.423684] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:41.752 [2024-07-25 13:37:22.423850] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:28:41.752 [2024-07-25 13:37:22.495427] thread.c:2099:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:28:42.323 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:42.323 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@864 -- # return 0 00:28:42.323 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:28:42.323 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:42.323 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:28:42.323 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:42.584 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:42.584 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:28:42.584 "name": "app_thread", 00:28:42.584 "id": 1, 00:28:42.584 "active_pollers": [], 00:28:42.584 "timed_pollers": [ 00:28:42.584 { 00:28:42.584 "name": "rpc_subsystem_poll_servers", 00:28:42.584 "id": 1, 00:28:42.584 "state": "waiting", 00:28:42.584 "run_count": 0, 00:28:42.584 "busy_count": 0, 00:28:42.584 "period_ticks": 10400000 00:28:42.584 } 00:28:42.584 ], 00:28:42.584 "paused_pollers": [] 00:28:42.584 }' 00:28:42.584 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:28:42.584 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:28:42.584 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:28:42.584 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:28:42.584 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:28:42.584 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:28:42.584 
13:37:23 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:28:42.584 13:37:23 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:28:42.584 13:37:23 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:28:42.584 5000+0 records in 00:28:42.584 5000+0 records out 00:28:42.584 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0168359 s, 608 MB/s 00:28:42.584 13:37:23 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:28:42.844 AIO0 00:28:42.844 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:43.104 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:28:43.104 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:28:43.104 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:28:43.104 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@561 -- # xtrace_disable 00:28:43.104 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:43.104 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:28:43.104 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:28:43.104 "name": "app_thread", 00:28:43.104 "id": 1, 00:28:43.104 "active_pollers": [], 00:28:43.104 "timed_pollers": [ 00:28:43.104 { 00:28:43.104 "name": "rpc_subsystem_poll_servers", 00:28:43.104 "id": 1, 00:28:43.104 "state": "waiting", 00:28:43.104 "run_count": 0, 00:28:43.104 "busy_count": 0, 
00:28:43.104 "period_ticks": 10400000 00:28:43.104 } 00:28:43.104 ], 00:28:43.104 "paused_pollers": [] 00:28:43.104 }' 00:28:43.104 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:28:43.365 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:28:43.365 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:28:43.365 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:28:43.365 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:28:43.365 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:28:43.365 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:28:43.365 13:37:23 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 1067436 00:28:43.365 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@950 -- # '[' -z 1067436 ']' 00:28:43.365 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@954 -- # kill -0 1067436 00:28:43.365 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@955 -- # uname 00:28:43.365 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:43.365 13:37:23 reap_unregistered_poller -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1067436 00:28:43.365 13:37:24 reap_unregistered_poller -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:28:43.365 13:37:24 reap_unregistered_poller -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:28:43.365 13:37:24 reap_unregistered_poller -- common/autotest_common.sh@968 -- # 
echo 'killing process with pid 1067436' 00:28:43.365 killing process with pid 1067436 00:28:43.365 13:37:24 reap_unregistered_poller -- common/autotest_common.sh@969 -- # kill 1067436 00:28:43.365 13:37:24 reap_unregistered_poller -- common/autotest_common.sh@974 -- # wait 1067436 00:28:43.625 13:37:24 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:28:43.625 13:37:24 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:28:43.625 00:28:43.625 real 0m2.266s 00:28:43.625 user 0m1.374s 00:28:43.625 sys 0m0.590s 00:28:43.625 13:37:24 reap_unregistered_poller -- common/autotest_common.sh@1126 -- # xtrace_disable 00:28:43.625 13:37:24 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:43.625 ************************************ 00:28:43.625 END TEST reap_unregistered_poller 00:28:43.625 ************************************ 00:28:43.625 13:37:24 -- spdk/autotest.sh@202 -- # uname -s 00:28:43.625 13:37:24 -- spdk/autotest.sh@202 -- # [[ Linux == Linux ]] 00:28:43.625 13:37:24 -- spdk/autotest.sh@203 -- # [[ 1 -eq 1 ]] 00:28:43.625 13:37:24 -- spdk/autotest.sh@209 -- # [[ 1 -eq 0 ]] 00:28:43.626 13:37:24 -- spdk/autotest.sh@215 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@260 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@264 -- # timing_exit lib 00:28:43.626 13:37:24 -- common/autotest_common.sh@730 -- # xtrace_disable 00:28:43.626 13:37:24 -- common/autotest_common.sh@10 -- # set +x 00:28:43.626 13:37:24 -- spdk/autotest.sh@266 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@274 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@283 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@320 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 
-- spdk/autotest.sh@325 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@334 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@347 -- # '[' 0 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@351 -- # '[' 1 -eq 1 ']' 00:28:43.626 13:37:24 -- spdk/autotest.sh@352 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:28:43.626 13:37:24 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:28:43.626 13:37:24 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:28:43.626 13:37:24 -- common/autotest_common.sh@10 -- # set +x 00:28:43.626 ************************************ 00:28:43.626 START TEST compress_compdev 00:28:43.626 ************************************ 00:28:43.626 13:37:24 compress_compdev -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:28:43.626 * Looking for test storage... 
00:28:43.887 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:28:43.887 13:37:24 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:28:43.887 13:37:24 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:43.887 13:37:24 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:28:43.888 13:37:24 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:43.888 13:37:24 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:43.888 13:37:24 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:43.888 13:37:24 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:43.888 13:37:24 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:43.888 13:37:24 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:28:43.888 13:37:24 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:43.888 13:37:24 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:28:43.888 13:37:24 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:28:43.888 13:37:24 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:28:43.888 13:37:24 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:28:43.888 13:37:24 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:28:43.888 13:37:24 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:28:43.888 13:37:24 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:28:43.888 13:37:24 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:28:43.888 13:37:24 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:28:43.888 13:37:24 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:43.888 13:37:24 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:28:43.888 13:37:24 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:28:43.888 13:37:24 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:28:43.888 13:37:24 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:28:43.888 13:37:24 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1067840 00:28:43.888 13:37:24 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:43.888 13:37:24 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1067840 00:28:43.888 13:37:24 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:28:43.888 13:37:24 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1067840 ']' 00:28:43.888 13:37:24 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:43.888 13:37:24 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:43.888 13:37:24 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:43.888 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:43.888 13:37:24 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:43.888 13:37:24 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:28:43.888 [2024-07-25 13:37:24.519254] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:28:43.888 [2024-07-25 13:37:24.519319] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1067840 ] 00:28:43.888 [2024-07-25 13:37:24.611641] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:44.148 [2024-07-25 13:37:24.721312] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:44.148 [2024-07-25 13:37:24.721318] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:44.718 [2024-07-25 13:37:25.360376] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:28:44.718 13:37:25 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:44.718 13:37:25 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:28:44.718 13:37:25 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:28:44.718 13:37:25 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:44.718 13:37:25 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:48.018 [2024-07-25 13:37:28.488387] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2a646b0 PMD being used: compress_qat 00:28:48.018 13:37:28 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:48.018 13:37:28 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:28:48.018 13:37:28 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:48.018 13:37:28 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:28:48.018 13:37:28 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:48.018 13:37:28 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 
00:28:48.018 13:37:28 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:48.018 13:37:28 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:48.279 [ 00:28:48.279 { 00:28:48.279 "name": "Nvme0n1", 00:28:48.279 "aliases": [ 00:28:48.279 "80eef522-5ddd-4e32-8103-af9da26b10c9" 00:28:48.279 ], 00:28:48.279 "product_name": "NVMe disk", 00:28:48.279 "block_size": 512, 00:28:48.279 "num_blocks": 3907029168, 00:28:48.279 "uuid": "80eef522-5ddd-4e32-8103-af9da26b10c9", 00:28:48.279 "assigned_rate_limits": { 00:28:48.279 "rw_ios_per_sec": 0, 00:28:48.279 "rw_mbytes_per_sec": 0, 00:28:48.279 "r_mbytes_per_sec": 0, 00:28:48.279 "w_mbytes_per_sec": 0 00:28:48.279 }, 00:28:48.279 "claimed": false, 00:28:48.279 "zoned": false, 00:28:48.279 "supported_io_types": { 00:28:48.279 "read": true, 00:28:48.279 "write": true, 00:28:48.279 "unmap": true, 00:28:48.279 "flush": true, 00:28:48.279 "reset": true, 00:28:48.279 "nvme_admin": true, 00:28:48.279 "nvme_io": true, 00:28:48.279 "nvme_io_md": false, 00:28:48.279 "write_zeroes": true, 00:28:48.279 "zcopy": false, 00:28:48.279 "get_zone_info": false, 00:28:48.279 "zone_management": false, 00:28:48.279 "zone_append": false, 00:28:48.279 "compare": false, 00:28:48.279 "compare_and_write": false, 00:28:48.279 "abort": true, 00:28:48.279 "seek_hole": false, 00:28:48.279 "seek_data": false, 00:28:48.279 "copy": false, 00:28:48.279 "nvme_iov_md": false 00:28:48.279 }, 00:28:48.279 "driver_specific": { 00:28:48.279 "nvme": [ 00:28:48.279 { 00:28:48.279 "pci_address": "0000:65:00.0", 00:28:48.279 "trid": { 00:28:48.279 "trtype": "PCIe", 00:28:48.279 "traddr": "0000:65:00.0" 00:28:48.279 }, 00:28:48.279 "ctrlr_data": { 00:28:48.279 "cntlid": 0, 00:28:48.279 "vendor_id": "0x8086", 00:28:48.279 "model_number": "INTEL SSDPE2KX020T8", 00:28:48.279 
"serial_number": "PHLJ9512038S2P0BGN", 00:28:48.279 "firmware_revision": "VDV10184", 00:28:48.279 "oacs": { 00:28:48.279 "security": 0, 00:28:48.279 "format": 1, 00:28:48.279 "firmware": 1, 00:28:48.279 "ns_manage": 1 00:28:48.279 }, 00:28:48.279 "multi_ctrlr": false, 00:28:48.279 "ana_reporting": false 00:28:48.279 }, 00:28:48.279 "vs": { 00:28:48.279 "nvme_version": "1.2" 00:28:48.279 }, 00:28:48.279 "ns_data": { 00:28:48.279 "id": 1, 00:28:48.279 "can_share": false 00:28:48.279 } 00:28:48.279 } 00:28:48.279 ], 00:28:48.279 "mp_policy": "active_passive" 00:28:48.279 } 00:28:48.279 } 00:28:48.279 ] 00:28:48.279 13:37:28 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:28:48.279 13:37:28 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:48.540 [2024-07-25 13:37:29.122127] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x289b600 PMD being used: compress_qat 00:28:49.482 c10a420e-ec5c-4459-a061-84cdcade2b77 00:28:49.482 13:37:30 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:49.742 45ab4e75-5f6b-4fd7-ad68-776d422f7523 00:28:49.742 13:37:30 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:49.742 13:37:30 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:28:49.742 13:37:30 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:49.742 13:37:30 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:28:49.742 13:37:30 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:49.742 13:37:30 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:28:49.742 13:37:30 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_wait_for_examine 00:28:50.003 13:37:30 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:50.264 [ 00:28:50.264 { 00:28:50.264 "name": "45ab4e75-5f6b-4fd7-ad68-776d422f7523", 00:28:50.264 "aliases": [ 00:28:50.264 "lvs0/lv0" 00:28:50.264 ], 00:28:50.264 "product_name": "Logical Volume", 00:28:50.264 "block_size": 512, 00:28:50.264 "num_blocks": 204800, 00:28:50.264 "uuid": "45ab4e75-5f6b-4fd7-ad68-776d422f7523", 00:28:50.264 "assigned_rate_limits": { 00:28:50.264 "rw_ios_per_sec": 0, 00:28:50.264 "rw_mbytes_per_sec": 0, 00:28:50.264 "r_mbytes_per_sec": 0, 00:28:50.264 "w_mbytes_per_sec": 0 00:28:50.264 }, 00:28:50.264 "claimed": false, 00:28:50.264 "zoned": false, 00:28:50.264 "supported_io_types": { 00:28:50.264 "read": true, 00:28:50.264 "write": true, 00:28:50.264 "unmap": true, 00:28:50.264 "flush": false, 00:28:50.264 "reset": true, 00:28:50.264 "nvme_admin": false, 00:28:50.264 "nvme_io": false, 00:28:50.264 "nvme_io_md": false, 00:28:50.264 "write_zeroes": true, 00:28:50.264 "zcopy": false, 00:28:50.264 "get_zone_info": false, 00:28:50.264 "zone_management": false, 00:28:50.264 "zone_append": false, 00:28:50.264 "compare": false, 00:28:50.264 "compare_and_write": false, 00:28:50.264 "abort": false, 00:28:50.264 "seek_hole": true, 00:28:50.264 "seek_data": true, 00:28:50.264 "copy": false, 00:28:50.264 "nvme_iov_md": false 00:28:50.264 }, 00:28:50.264 "driver_specific": { 00:28:50.264 "lvol": { 00:28:50.264 "lvol_store_uuid": "c10a420e-ec5c-4459-a061-84cdcade2b77", 00:28:50.264 "base_bdev": "Nvme0n1", 00:28:50.264 "thin_provision": true, 00:28:50.264 "num_allocated_clusters": 0, 00:28:50.264 "snapshot": false, 00:28:50.264 "clone": false, 00:28:50.264 "esnap_clone": false 00:28:50.264 } 00:28:50.264 } 00:28:50.264 } 00:28:50.264 ] 00:28:50.264 13:37:30 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:28:50.264 13:37:30 
compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:28:50.264 13:37:30 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:28:50.264 [2024-07-25 13:37:31.034413] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:50.264 COMP_lvs0/lv0 00:28:50.525 13:37:31 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:50.525 13:37:31 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:28:50.525 13:37:31 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:28:50.525 13:37:31 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:28:50.525 13:37:31 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:28:50.525 13:37:31 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:28:50.525 13:37:31 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:50.525 13:37:31 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:50.787 [ 00:28:50.787 { 00:28:50.787 "name": "COMP_lvs0/lv0", 00:28:50.787 "aliases": [ 00:28:50.787 "443fda1a-e184-55df-a182-512afbdf65d2" 00:28:50.787 ], 00:28:50.787 "product_name": "compress", 00:28:50.787 "block_size": 512, 00:28:50.787 "num_blocks": 200704, 00:28:50.787 "uuid": "443fda1a-e184-55df-a182-512afbdf65d2", 00:28:50.787 "assigned_rate_limits": { 00:28:50.787 "rw_ios_per_sec": 0, 00:28:50.787 "rw_mbytes_per_sec": 0, 00:28:50.787 "r_mbytes_per_sec": 0, 00:28:50.787 "w_mbytes_per_sec": 0 00:28:50.787 }, 00:28:50.787 "claimed": false, 00:28:50.787 "zoned": false, 00:28:50.787 "supported_io_types": { 00:28:50.787 "read": true, 00:28:50.787 
"write": true, 00:28:50.787 "unmap": false, 00:28:50.787 "flush": false, 00:28:50.787 "reset": false, 00:28:50.787 "nvme_admin": false, 00:28:50.787 "nvme_io": false, 00:28:50.787 "nvme_io_md": false, 00:28:50.787 "write_zeroes": true, 00:28:50.787 "zcopy": false, 00:28:50.787 "get_zone_info": false, 00:28:50.787 "zone_management": false, 00:28:50.787 "zone_append": false, 00:28:50.787 "compare": false, 00:28:50.787 "compare_and_write": false, 00:28:50.787 "abort": false, 00:28:50.787 "seek_hole": false, 00:28:50.787 "seek_data": false, 00:28:50.787 "copy": false, 00:28:50.787 "nvme_iov_md": false 00:28:50.787 }, 00:28:50.787 "driver_specific": { 00:28:50.787 "compress": { 00:28:50.787 "name": "COMP_lvs0/lv0", 00:28:50.787 "base_bdev_name": "45ab4e75-5f6b-4fd7-ad68-776d422f7523", 00:28:50.787 "pm_path": "/tmp/pmem/0a684acf-e09e-44cd-aa14-adc7599f6c5e" 00:28:50.787 } 00:28:50.787 } 00:28:50.787 } 00:28:50.787 ] 00:28:50.787 13:37:31 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:28:50.787 13:37:31 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:50.787 [2024-07-25 13:37:31.576167] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f809c1b15c0 PMD being used: compress_qat 00:28:51.048 [2024-07-25 13:37:31.579286] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2a91b30 PMD being used: compress_qat 00:28:51.048 Running I/O for 3 seconds... 
00:28:54.349 00:28:54.349 Latency(us) 00:28:54.349 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:54.349 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:28:54.349 Verification LBA range: start 0x0 length 0x3100 00:28:54.349 COMP_lvs0/lv0 : 3.01 1517.55 5.93 0.00 0.00 20995.98 519.88 22282.24 00:28:54.349 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:28:54.349 Verification LBA range: start 0x3100 length 0x3100 00:28:54.349 COMP_lvs0/lv0 : 3.01 1594.29 6.23 0.00 0.00 19934.93 209.53 22080.59 00:28:54.349 =================================================================================================================== 00:28:54.349 Total : 3111.84 12.16 0.00 0.00 20452.33 209.53 22282.24 00:28:54.349 0 00:28:54.349 13:37:34 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:28:54.349 13:37:34 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:54.349 13:37:34 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:54.349 13:37:35 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:28:54.349 13:37:35 compress_compdev -- compress/compress.sh@78 -- # killprocess 1067840 00:28:54.349 13:37:35 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1067840 ']' 00:28:54.349 13:37:35 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1067840 00:28:54.349 13:37:35 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:28:54.349 13:37:35 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:28:54.349 13:37:35 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1067840 00:28:54.349 13:37:35 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_1 
00:28:54.349 13:37:35 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:28:54.349 13:37:35 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1067840' 00:28:54.349 killing process with pid 1067840 00:28:54.349 13:37:35 compress_compdev -- common/autotest_common.sh@969 -- # kill 1067840 00:28:54.349 Received shutdown signal, test time was about 3.000000 seconds 00:28:54.349 00:28:54.349 Latency(us) 00:28:54.349 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:54.349 =================================================================================================================== 00:28:54.349 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:54.349 13:37:35 compress_compdev -- common/autotest_common.sh@974 -- # wait 1067840 00:28:56.893 13:37:37 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:28:56.893 13:37:37 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:28:56.893 13:37:37 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1069984 00:28:56.893 13:37:37 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:56.893 13:37:37 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1069984 00:28:56.893 13:37:37 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:28:56.893 13:37:37 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1069984 ']' 00:28:56.893 13:37:37 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:56.893 13:37:37 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:28:56.893 13:37:37 compress_compdev -- common/autotest_common.sh@838 
-- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:56.893 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:56.893 13:37:37 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:28:56.893 13:37:37 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:28:56.893 [2024-07-25 13:37:37.493363] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:28:56.893 [2024-07-25 13:37:37.493436] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1069984 ] 00:28:56.893 [2024-07-25 13:37:37.585230] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:57.153 [2024-07-25 13:37:37.695215] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:28:57.154 [2024-07-25 13:37:37.695221] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:28:57.724 [2024-07-25 13:37:38.323442] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:28:57.724 13:37:38 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:28:57.724 13:37:38 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:28:57.724 13:37:38 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:28:57.724 13:37:38 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:57.724 13:37:38 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:01.022 [2024-07-25 13:37:41.450766] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x16706b0 PMD being used: compress_qat 00:29:01.022 13:37:41 compress_compdev -- 
compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:01.022 13:37:41 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:29:01.022 13:37:41 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:01.022 13:37:41 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:29:01.022 13:37:41 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:01.022 13:37:41 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:01.022 13:37:41 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:01.022 13:37:41 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:01.313 [ 00:29:01.313 { 00:29:01.313 "name": "Nvme0n1", 00:29:01.313 "aliases": [ 00:29:01.313 "ff35f155-2d67-4137-968a-27cb75a9ee8d" 00:29:01.313 ], 00:29:01.313 "product_name": "NVMe disk", 00:29:01.313 "block_size": 512, 00:29:01.313 "num_blocks": 3907029168, 00:29:01.313 "uuid": "ff35f155-2d67-4137-968a-27cb75a9ee8d", 00:29:01.313 "assigned_rate_limits": { 00:29:01.313 "rw_ios_per_sec": 0, 00:29:01.313 "rw_mbytes_per_sec": 0, 00:29:01.313 "r_mbytes_per_sec": 0, 00:29:01.313 "w_mbytes_per_sec": 0 00:29:01.313 }, 00:29:01.313 "claimed": false, 00:29:01.313 "zoned": false, 00:29:01.313 "supported_io_types": { 00:29:01.313 "read": true, 00:29:01.313 "write": true, 00:29:01.313 "unmap": true, 00:29:01.313 "flush": true, 00:29:01.313 "reset": true, 00:29:01.313 "nvme_admin": true, 00:29:01.313 "nvme_io": true, 00:29:01.313 "nvme_io_md": false, 00:29:01.313 "write_zeroes": true, 00:29:01.313 "zcopy": false, 00:29:01.313 "get_zone_info": false, 00:29:01.313 "zone_management": false, 00:29:01.313 "zone_append": false, 00:29:01.313 "compare": false, 00:29:01.313 "compare_and_write": false, 00:29:01.313 "abort": true, 00:29:01.313 
"seek_hole": false, 00:29:01.313 "seek_data": false, 00:29:01.313 "copy": false, 00:29:01.313 "nvme_iov_md": false 00:29:01.313 }, 00:29:01.313 "driver_specific": { 00:29:01.313 "nvme": [ 00:29:01.313 { 00:29:01.313 "pci_address": "0000:65:00.0", 00:29:01.313 "trid": { 00:29:01.313 "trtype": "PCIe", 00:29:01.313 "traddr": "0000:65:00.0" 00:29:01.313 }, 00:29:01.313 "ctrlr_data": { 00:29:01.313 "cntlid": 0, 00:29:01.313 "vendor_id": "0x8086", 00:29:01.313 "model_number": "INTEL SSDPE2KX020T8", 00:29:01.313 "serial_number": "PHLJ9512038S2P0BGN", 00:29:01.313 "firmware_revision": "VDV10184", 00:29:01.313 "oacs": { 00:29:01.313 "security": 0, 00:29:01.313 "format": 1, 00:29:01.313 "firmware": 1, 00:29:01.313 "ns_manage": 1 00:29:01.313 }, 00:29:01.313 "multi_ctrlr": false, 00:29:01.313 "ana_reporting": false 00:29:01.313 }, 00:29:01.313 "vs": { 00:29:01.313 "nvme_version": "1.2" 00:29:01.313 }, 00:29:01.313 "ns_data": { 00:29:01.313 "id": 1, 00:29:01.313 "can_share": false 00:29:01.313 } 00:29:01.313 } 00:29:01.313 ], 00:29:01.313 "mp_policy": "active_passive" 00:29:01.313 } 00:29:01.313 } 00:29:01.313 ] 00:29:01.313 13:37:41 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:29:01.313 13:37:41 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:01.576 [2024-07-25 13:37:42.096696] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x14a7600 PMD being used: compress_qat 00:29:02.518 722f7b69-269a-418a-b17e-aecf86b0184d 00:29:02.518 13:37:43 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:02.777 ad7cbcac-ae77-4ba6-a4e6-6d8c93747974 00:29:02.777 13:37:43 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:02.777 13:37:43 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 
00:29:02.777 13:37:43 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:02.777 13:37:43 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:29:02.777 13:37:43 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:02.777 13:37:43 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:02.777 13:37:43 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:03.036 13:37:43 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:03.297 [ 00:29:03.297 { 00:29:03.297 "name": "ad7cbcac-ae77-4ba6-a4e6-6d8c93747974", 00:29:03.297 "aliases": [ 00:29:03.297 "lvs0/lv0" 00:29:03.297 ], 00:29:03.297 "product_name": "Logical Volume", 00:29:03.297 "block_size": 512, 00:29:03.297 "num_blocks": 204800, 00:29:03.297 "uuid": "ad7cbcac-ae77-4ba6-a4e6-6d8c93747974", 00:29:03.297 "assigned_rate_limits": { 00:29:03.297 "rw_ios_per_sec": 0, 00:29:03.297 "rw_mbytes_per_sec": 0, 00:29:03.297 "r_mbytes_per_sec": 0, 00:29:03.297 "w_mbytes_per_sec": 0 00:29:03.297 }, 00:29:03.297 "claimed": false, 00:29:03.297 "zoned": false, 00:29:03.297 "supported_io_types": { 00:29:03.297 "read": true, 00:29:03.297 "write": true, 00:29:03.297 "unmap": true, 00:29:03.297 "flush": false, 00:29:03.297 "reset": true, 00:29:03.297 "nvme_admin": false, 00:29:03.297 "nvme_io": false, 00:29:03.297 "nvme_io_md": false, 00:29:03.297 "write_zeroes": true, 00:29:03.297 "zcopy": false, 00:29:03.297 "get_zone_info": false, 00:29:03.297 "zone_management": false, 00:29:03.297 "zone_append": false, 00:29:03.297 "compare": false, 00:29:03.297 "compare_and_write": false, 00:29:03.297 "abort": false, 00:29:03.297 "seek_hole": true, 00:29:03.297 "seek_data": true, 00:29:03.297 "copy": false, 00:29:03.297 "nvme_iov_md": false 00:29:03.297 }, 00:29:03.297 
"driver_specific": { 00:29:03.297 "lvol": { 00:29:03.297 "lvol_store_uuid": "722f7b69-269a-418a-b17e-aecf86b0184d", 00:29:03.297 "base_bdev": "Nvme0n1", 00:29:03.297 "thin_provision": true, 00:29:03.297 "num_allocated_clusters": 0, 00:29:03.297 "snapshot": false, 00:29:03.297 "clone": false, 00:29:03.297 "esnap_clone": false 00:29:03.297 } 00:29:03.297 } 00:29:03.297 } 00:29:03.297 ] 00:29:03.297 13:37:43 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:29:03.297 13:37:43 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:29:03.297 13:37:43 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:29:03.297 [2024-07-25 13:37:44.062135] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:03.297 COMP_lvs0/lv0 00:29:03.557 13:37:44 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:03.557 13:37:44 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:29:03.557 13:37:44 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:03.557 13:37:44 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:29:03.557 13:37:44 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:03.557 13:37:44 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:03.557 13:37:44 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:03.557 13:37:44 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:03.818 [ 00:29:03.818 { 00:29:03.818 "name": "COMP_lvs0/lv0", 00:29:03.818 "aliases": [ 00:29:03.818 "64fab970-25b0-5417-87b3-080d47d09f55" 00:29:03.818 ], 
00:29:03.818 "product_name": "compress", 00:29:03.818 "block_size": 512, 00:29:03.818 "num_blocks": 200704, 00:29:03.818 "uuid": "64fab970-25b0-5417-87b3-080d47d09f55", 00:29:03.818 "assigned_rate_limits": { 00:29:03.818 "rw_ios_per_sec": 0, 00:29:03.818 "rw_mbytes_per_sec": 0, 00:29:03.818 "r_mbytes_per_sec": 0, 00:29:03.818 "w_mbytes_per_sec": 0 00:29:03.818 }, 00:29:03.818 "claimed": false, 00:29:03.818 "zoned": false, 00:29:03.818 "supported_io_types": { 00:29:03.818 "read": true, 00:29:03.818 "write": true, 00:29:03.818 "unmap": false, 00:29:03.818 "flush": false, 00:29:03.818 "reset": false, 00:29:03.818 "nvme_admin": false, 00:29:03.818 "nvme_io": false, 00:29:03.818 "nvme_io_md": false, 00:29:03.818 "write_zeroes": true, 00:29:03.818 "zcopy": false, 00:29:03.818 "get_zone_info": false, 00:29:03.818 "zone_management": false, 00:29:03.818 "zone_append": false, 00:29:03.818 "compare": false, 00:29:03.818 "compare_and_write": false, 00:29:03.818 "abort": false, 00:29:03.818 "seek_hole": false, 00:29:03.818 "seek_data": false, 00:29:03.818 "copy": false, 00:29:03.818 "nvme_iov_md": false 00:29:03.818 }, 00:29:03.818 "driver_specific": { 00:29:03.818 "compress": { 00:29:03.818 "name": "COMP_lvs0/lv0", 00:29:03.818 "base_bdev_name": "ad7cbcac-ae77-4ba6-a4e6-6d8c93747974", 00:29:03.818 "pm_path": "/tmp/pmem/20e4f35b-d449-4322-93cd-4be7ccf19649" 00:29:03.818 } 00:29:03.818 } 00:29:03.818 } 00:29:03.818 ] 00:29:03.818 13:37:44 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:29:03.818 13:37:44 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:04.079 [2024-07-25 13:37:44.632086] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f036c1b15c0 PMD being used: compress_qat 00:29:04.079 [2024-07-25 13:37:44.635244] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x169dc40 PMD being used: compress_qat 00:29:04.079 Running I/O for 3 
seconds... 00:29:07.383 00:29:07.383 Latency(us) 00:29:07.383 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:07.383 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:07.383 Verification LBA range: start 0x0 length 0x3100 00:29:07.383 COMP_lvs0/lv0 : 3.01 1520.62 5.94 0.00 0.00 20957.98 456.86 21273.99 00:29:07.383 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:07.383 Verification LBA range: start 0x3100 length 0x3100 00:29:07.383 COMP_lvs0/lv0 : 3.01 1597.28 6.24 0.00 0.00 19886.65 319.80 22181.42 00:29:07.383 =================================================================================================================== 00:29:07.383 Total : 3117.89 12.18 0.00 0.00 20408.97 319.80 22181.42 00:29:07.383 0 00:29:07.383 13:37:47 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:07.383 13:37:47 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:07.383 13:37:47 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:07.383 13:37:48 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:07.383 13:37:48 compress_compdev -- compress/compress.sh@78 -- # killprocess 1069984 00:29:07.383 13:37:48 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1069984 ']' 00:29:07.383 13:37:48 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1069984 00:29:07.383 13:37:48 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:29:07.383 13:37:48 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:07.383 13:37:48 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1069984 00:29:07.383 13:37:48 compress_compdev -- common/autotest_common.sh@956 -- # 
process_name=reactor_1 00:29:07.383 13:37:48 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:29:07.383 13:37:48 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1069984' 00:29:07.383 killing process with pid 1069984 00:29:07.383 13:37:48 compress_compdev -- common/autotest_common.sh@969 -- # kill 1069984 00:29:07.383 Received shutdown signal, test time was about 3.000000 seconds 00:29:07.383 00:29:07.383 Latency(us) 00:29:07.383 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:07.383 =================================================================================================================== 00:29:07.383 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:07.383 13:37:48 compress_compdev -- common/autotest_common.sh@974 -- # wait 1069984 00:29:09.929 13:37:50 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:29:09.929 13:37:50 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:09.929 13:37:50 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=1072041 00:29:09.929 13:37:50 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:09.929 13:37:50 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 1072041 00:29:09.929 13:37:50 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:29:09.929 13:37:50 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1072041 ']' 00:29:09.929 13:37:50 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:09.929 13:37:50 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:09.929 13:37:50 compress_compdev -- 
common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:09.929 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:09.929 13:37:50 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:09.929 13:37:50 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:09.929 [2024-07-25 13:37:50.686487] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:29:09.929 [2024-07-25 13:37:50.686560] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1072041 ] 00:29:10.189 [2024-07-25 13:37:50.778188] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:10.189 [2024-07-25 13:37:50.889271] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:10.189 [2024-07-25 13:37:50.889277] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:10.760 [2024-07-25 13:37:51.516614] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:11.021 13:37:51 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:11.021 13:37:51 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:29:11.021 13:37:51 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:29:11.021 13:37:51 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:11.021 13:37:51 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:14.322 [2024-07-25 13:37:54.648734] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c686b0 PMD being used: compress_qat 00:29:14.322 13:37:54 
compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:14.322 13:37:54 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:29:14.322 13:37:54 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:14.322 13:37:54 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:29:14.322 13:37:54 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:14.322 13:37:54 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:14.322 13:37:54 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:14.322 13:37:54 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:14.322 [ 00:29:14.322 { 00:29:14.322 "name": "Nvme0n1", 00:29:14.322 "aliases": [ 00:29:14.322 "396b74f7-5d8e-4dfa-a108-d3665053c97f" 00:29:14.322 ], 00:29:14.322 "product_name": "NVMe disk", 00:29:14.322 "block_size": 512, 00:29:14.322 "num_blocks": 3907029168, 00:29:14.322 "uuid": "396b74f7-5d8e-4dfa-a108-d3665053c97f", 00:29:14.322 "assigned_rate_limits": { 00:29:14.322 "rw_ios_per_sec": 0, 00:29:14.322 "rw_mbytes_per_sec": 0, 00:29:14.322 "r_mbytes_per_sec": 0, 00:29:14.322 "w_mbytes_per_sec": 0 00:29:14.322 }, 00:29:14.322 "claimed": false, 00:29:14.322 "zoned": false, 00:29:14.322 "supported_io_types": { 00:29:14.322 "read": true, 00:29:14.322 "write": true, 00:29:14.322 "unmap": true, 00:29:14.322 "flush": true, 00:29:14.322 "reset": true, 00:29:14.322 "nvme_admin": true, 00:29:14.322 "nvme_io": true, 00:29:14.322 "nvme_io_md": false, 00:29:14.322 "write_zeroes": true, 00:29:14.322 "zcopy": false, 00:29:14.322 "get_zone_info": false, 00:29:14.322 "zone_management": false, 00:29:14.322 "zone_append": false, 00:29:14.322 "compare": false, 00:29:14.322 "compare_and_write": false, 00:29:14.322 
"abort": true, 00:29:14.322 "seek_hole": false, 00:29:14.322 "seek_data": false, 00:29:14.322 "copy": false, 00:29:14.322 "nvme_iov_md": false 00:29:14.322 }, 00:29:14.322 "driver_specific": { 00:29:14.322 "nvme": [ 00:29:14.322 { 00:29:14.322 "pci_address": "0000:65:00.0", 00:29:14.322 "trid": { 00:29:14.322 "trtype": "PCIe", 00:29:14.322 "traddr": "0000:65:00.0" 00:29:14.322 }, 00:29:14.322 "ctrlr_data": { 00:29:14.322 "cntlid": 0, 00:29:14.322 "vendor_id": "0x8086", 00:29:14.322 "model_number": "INTEL SSDPE2KX020T8", 00:29:14.322 "serial_number": "PHLJ9512038S2P0BGN", 00:29:14.322 "firmware_revision": "VDV10184", 00:29:14.322 "oacs": { 00:29:14.322 "security": 0, 00:29:14.322 "format": 1, 00:29:14.322 "firmware": 1, 00:29:14.322 "ns_manage": 1 00:29:14.322 }, 00:29:14.322 "multi_ctrlr": false, 00:29:14.322 "ana_reporting": false 00:29:14.322 }, 00:29:14.322 "vs": { 00:29:14.322 "nvme_version": "1.2" 00:29:14.322 }, 00:29:14.322 "ns_data": { 00:29:14.322 "id": 1, 00:29:14.322 "can_share": false 00:29:14.322 } 00:29:14.322 } 00:29:14.322 ], 00:29:14.322 "mp_policy": "active_passive" 00:29:14.322 } 00:29:14.322 } 00:29:14.322 ] 00:29:14.322 13:37:55 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:29:14.322 13:37:55 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:14.583 [2024-07-25 13:37:55.274577] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1a9f600 PMD being used: compress_qat 00:29:15.525 804db219-70c3-41c4-909a-37bb12334d04 00:29:15.785 13:37:56 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:15.785 46fe7d0f-d596-4e5a-9a94-f6a9b9b0ccd2 00:29:15.785 13:37:56 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:15.785 13:37:56 compress_compdev -- common/autotest_common.sh@899 -- 
# local bdev_name=lvs0/lv0 00:29:15.785 13:37:56 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:15.785 13:37:56 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:29:15.785 13:37:56 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:15.785 13:37:56 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:15.785 13:37:56 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:16.046 13:37:56 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:16.307 [ 00:29:16.307 { 00:29:16.307 "name": "46fe7d0f-d596-4e5a-9a94-f6a9b9b0ccd2", 00:29:16.307 "aliases": [ 00:29:16.307 "lvs0/lv0" 00:29:16.307 ], 00:29:16.307 "product_name": "Logical Volume", 00:29:16.307 "block_size": 512, 00:29:16.307 "num_blocks": 204800, 00:29:16.307 "uuid": "46fe7d0f-d596-4e5a-9a94-f6a9b9b0ccd2", 00:29:16.307 "assigned_rate_limits": { 00:29:16.307 "rw_ios_per_sec": 0, 00:29:16.307 "rw_mbytes_per_sec": 0, 00:29:16.307 "r_mbytes_per_sec": 0, 00:29:16.307 "w_mbytes_per_sec": 0 00:29:16.307 }, 00:29:16.307 "claimed": false, 00:29:16.307 "zoned": false, 00:29:16.307 "supported_io_types": { 00:29:16.307 "read": true, 00:29:16.307 "write": true, 00:29:16.307 "unmap": true, 00:29:16.307 "flush": false, 00:29:16.307 "reset": true, 00:29:16.307 "nvme_admin": false, 00:29:16.307 "nvme_io": false, 00:29:16.307 "nvme_io_md": false, 00:29:16.307 "write_zeroes": true, 00:29:16.307 "zcopy": false, 00:29:16.307 "get_zone_info": false, 00:29:16.307 "zone_management": false, 00:29:16.308 "zone_append": false, 00:29:16.308 "compare": false, 00:29:16.308 "compare_and_write": false, 00:29:16.308 "abort": false, 00:29:16.308 "seek_hole": true, 00:29:16.308 "seek_data": true, 00:29:16.308 "copy": false, 00:29:16.308 "nvme_iov_md": false 
00:29:16.308 }, 00:29:16.308 "driver_specific": { 00:29:16.308 "lvol": { 00:29:16.308 "lvol_store_uuid": "804db219-70c3-41c4-909a-37bb12334d04", 00:29:16.308 "base_bdev": "Nvme0n1", 00:29:16.308 "thin_provision": true, 00:29:16.308 "num_allocated_clusters": 0, 00:29:16.308 "snapshot": false, 00:29:16.308 "clone": false, 00:29:16.308 "esnap_clone": false 00:29:16.308 } 00:29:16.308 } 00:29:16.308 } 00:29:16.308 ] 00:29:16.308 13:37:56 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:29:16.308 13:37:56 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:29:16.308 13:37:56 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:29:16.573 [2024-07-25 13:37:57.157817] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:16.573 COMP_lvs0/lv0 00:29:16.573 13:37:57 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:16.573 13:37:57 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:29:16.573 13:37:57 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:16.573 13:37:57 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:29:16.573 13:37:57 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:16.573 13:37:57 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:16.573 13:37:57 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:16.835 13:37:57 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:16.835 [ 00:29:16.835 { 00:29:16.835 "name": "COMP_lvs0/lv0", 00:29:16.835 "aliases": [ 00:29:16.835 
"85266dd5-8d3c-512a-bfe9-299e1f84d2b8" 00:29:16.835 ], 00:29:16.835 "product_name": "compress", 00:29:16.835 "block_size": 4096, 00:29:16.835 "num_blocks": 25088, 00:29:16.835 "uuid": "85266dd5-8d3c-512a-bfe9-299e1f84d2b8", 00:29:16.835 "assigned_rate_limits": { 00:29:16.835 "rw_ios_per_sec": 0, 00:29:16.835 "rw_mbytes_per_sec": 0, 00:29:16.835 "r_mbytes_per_sec": 0, 00:29:16.835 "w_mbytes_per_sec": 0 00:29:16.835 }, 00:29:16.835 "claimed": false, 00:29:16.835 "zoned": false, 00:29:16.835 "supported_io_types": { 00:29:16.835 "read": true, 00:29:16.835 "write": true, 00:29:16.835 "unmap": false, 00:29:16.835 "flush": false, 00:29:16.835 "reset": false, 00:29:16.835 "nvme_admin": false, 00:29:16.835 "nvme_io": false, 00:29:16.835 "nvme_io_md": false, 00:29:16.835 "write_zeroes": true, 00:29:16.835 "zcopy": false, 00:29:16.835 "get_zone_info": false, 00:29:16.835 "zone_management": false, 00:29:16.835 "zone_append": false, 00:29:16.835 "compare": false, 00:29:16.835 "compare_and_write": false, 00:29:16.835 "abort": false, 00:29:16.835 "seek_hole": false, 00:29:16.835 "seek_data": false, 00:29:16.835 "copy": false, 00:29:16.835 "nvme_iov_md": false 00:29:16.835 }, 00:29:16.835 "driver_specific": { 00:29:16.835 "compress": { 00:29:16.835 "name": "COMP_lvs0/lv0", 00:29:16.835 "base_bdev_name": "46fe7d0f-d596-4e5a-9a94-f6a9b9b0ccd2", 00:29:16.835 "pm_path": "/tmp/pmem/39459912-88d8-44b3-b09e-4fa40246fa58" 00:29:16.835 } 00:29:16.835 } 00:29:16.835 } 00:29:16.835 ] 00:29:16.835 13:37:57 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:29:16.835 13:37:57 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:17.096 [2024-07-25 13:37:57.715596] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f2c4c1b15c0 PMD being used: compress_qat 00:29:17.096 [2024-07-25 13:37:57.718643] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1c955e0 PMD being 
used: compress_qat 00:29:17.096 Running I/O for 3 seconds... 00:29:20.402 00:29:20.402 Latency(us) 00:29:20.402 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:20.402 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:20.402 Verification LBA range: start 0x0 length 0x3100 00:29:20.402 COMP_lvs0/lv0 : 3.01 1530.65 5.98 0.00 0.00 20799.64 269.39 21677.29 00:29:20.402 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:20.402 Verification LBA range: start 0x3100 length 0x3100 00:29:20.402 COMP_lvs0/lv0 : 3.01 1592.27 6.22 0.00 0.00 19973.23 579.74 22685.54 00:29:20.402 =================================================================================================================== 00:29:20.402 Total : 3122.92 12.20 0.00 0.00 20378.48 269.39 22685.54 00:29:20.402 0 00:29:20.402 13:38:00 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:20.402 13:38:00 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:20.402 13:38:00 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:20.663 13:38:01 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:20.663 13:38:01 compress_compdev -- compress/compress.sh@78 -- # killprocess 1072041 00:29:20.663 13:38:01 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1072041 ']' 00:29:20.663 13:38:01 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1072041 00:29:20.663 13:38:01 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:29:20.663 13:38:01 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:20.663 13:38:01 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1072041 00:29:20.663 13:38:01 compress_compdev 
-- common/autotest_common.sh@956 -- # process_name=reactor_1 00:29:20.663 13:38:01 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:29:20.663 13:38:01 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1072041' 00:29:20.663 killing process with pid 1072041 00:29:20.663 13:38:01 compress_compdev -- common/autotest_common.sh@969 -- # kill 1072041 00:29:20.663 Received shutdown signal, test time was about 3.000000 seconds 00:29:20.663 00:29:20.663 Latency(us) 00:29:20.663 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:20.663 =================================================================================================================== 00:29:20.663 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:20.663 13:38:01 compress_compdev -- common/autotest_common.sh@974 -- # wait 1072041 00:29:23.207 13:38:03 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:29:23.207 13:38:03 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:23.207 13:38:03 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=1074075 00:29:23.207 13:38:03 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:23.207 13:38:03 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 1074075 00:29:23.207 13:38:03 compress_compdev -- common/autotest_common.sh@831 -- # '[' -z 1074075 ']' 00:29:23.207 13:38:03 compress_compdev -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:23.207 13:38:03 compress_compdev -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:23.207 13:38:03 compress_compdev -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:23.207 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:29:23.207 13:38:03 compress_compdev -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:23.207 13:38:03 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:29:23.207 13:38:03 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:23.207 [2024-07-25 13:38:03.790408] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:29:23.208 [2024-07-25 13:38:03.790479] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1074075 ] 00:29:23.208 [2024-07-25 13:38:03.883718] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:23.208 [2024-07-25 13:38:03.981163] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:23.208 [2024-07-25 13:38:03.981201] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:29:23.208 [2024-07-25 13:38:03.981201] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:23.780 [2024-07-25 13:38:04.441021] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:24.041 13:38:04 compress_compdev -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:24.041 13:38:04 compress_compdev -- common/autotest_common.sh@864 -- # return 0 00:29:24.041 13:38:04 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:29:24.041 13:38:04 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:24.041 13:38:04 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:27.344 [2024-07-25 13:38:07.648132] accel_dpdk_compressdev.c: 
690:_set_pmd: *NOTICE*: Channel 0xec1160 PMD being used: compress_qat 00:29:27.344 13:38:07 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:27.344 13:38:07 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:29:27.344 13:38:07 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:27.344 13:38:07 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:29:27.344 13:38:07 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:27.344 13:38:07 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:27.344 13:38:07 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:27.344 13:38:07 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:27.344 [ 00:29:27.344 { 00:29:27.344 "name": "Nvme0n1", 00:29:27.344 "aliases": [ 00:29:27.344 "1a885cd4-9e02-4c91-be26-d22754645620" 00:29:27.344 ], 00:29:27.344 "product_name": "NVMe disk", 00:29:27.344 "block_size": 512, 00:29:27.344 "num_blocks": 3907029168, 00:29:27.344 "uuid": "1a885cd4-9e02-4c91-be26-d22754645620", 00:29:27.344 "assigned_rate_limits": { 00:29:27.344 "rw_ios_per_sec": 0, 00:29:27.344 "rw_mbytes_per_sec": 0, 00:29:27.344 "r_mbytes_per_sec": 0, 00:29:27.344 "w_mbytes_per_sec": 0 00:29:27.344 }, 00:29:27.344 "claimed": false, 00:29:27.344 "zoned": false, 00:29:27.344 "supported_io_types": { 00:29:27.344 "read": true, 00:29:27.344 "write": true, 00:29:27.344 "unmap": true, 00:29:27.344 "flush": true, 00:29:27.344 "reset": true, 00:29:27.344 "nvme_admin": true, 00:29:27.344 "nvme_io": true, 00:29:27.344 "nvme_io_md": false, 00:29:27.344 "write_zeroes": true, 00:29:27.344 "zcopy": false, 00:29:27.344 "get_zone_info": false, 00:29:27.344 "zone_management": false, 00:29:27.344 "zone_append": false, 
00:29:27.344 "compare": false, 00:29:27.344 "compare_and_write": false, 00:29:27.344 "abort": true, 00:29:27.344 "seek_hole": false, 00:29:27.344 "seek_data": false, 00:29:27.344 "copy": false, 00:29:27.344 "nvme_iov_md": false 00:29:27.344 }, 00:29:27.344 "driver_specific": { 00:29:27.344 "nvme": [ 00:29:27.344 { 00:29:27.344 "pci_address": "0000:65:00.0", 00:29:27.344 "trid": { 00:29:27.344 "trtype": "PCIe", 00:29:27.344 "traddr": "0000:65:00.0" 00:29:27.344 }, 00:29:27.344 "ctrlr_data": { 00:29:27.344 "cntlid": 0, 00:29:27.344 "vendor_id": "0x8086", 00:29:27.344 "model_number": "INTEL SSDPE2KX020T8", 00:29:27.344 "serial_number": "PHLJ9512038S2P0BGN", 00:29:27.344 "firmware_revision": "VDV10184", 00:29:27.344 "oacs": { 00:29:27.344 "security": 0, 00:29:27.344 "format": 1, 00:29:27.344 "firmware": 1, 00:29:27.344 "ns_manage": 1 00:29:27.344 }, 00:29:27.344 "multi_ctrlr": false, 00:29:27.344 "ana_reporting": false 00:29:27.344 }, 00:29:27.344 "vs": { 00:29:27.344 "nvme_version": "1.2" 00:29:27.344 }, 00:29:27.344 "ns_data": { 00:29:27.344 "id": 1, 00:29:27.344 "can_share": false 00:29:27.344 } 00:29:27.344 } 00:29:27.344 ], 00:29:27.344 "mp_policy": "active_passive" 00:29:27.344 } 00:29:27.344 } 00:29:27.344 ] 00:29:27.344 13:38:08 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:29:27.344 13:38:08 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:27.606 [2024-07-25 13:38:08.272965] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xec1ad0 PMD being used: compress_qat 00:29:28.989 79141023-a0a3-4fd1-aa78-f2c5ab9bd2b4 00:29:28.989 13:38:09 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:28.989 54678218-a44f-4ca9-9125-0a0a97d9b900 00:29:28.989 13:38:09 compress_compdev -- compress/compress.sh@39 -- # waitforbdev 
lvs0/lv0 00:29:28.989 13:38:09 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:29:28.989 13:38:09 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:28.990 13:38:09 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:29:28.990 13:38:09 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:28.990 13:38:09 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:28.990 13:38:09 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:28.990 13:38:09 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:29.249 [ 00:29:29.249 { 00:29:29.249 "name": "54678218-a44f-4ca9-9125-0a0a97d9b900", 00:29:29.249 "aliases": [ 00:29:29.249 "lvs0/lv0" 00:29:29.249 ], 00:29:29.249 "product_name": "Logical Volume", 00:29:29.249 "block_size": 512, 00:29:29.249 "num_blocks": 204800, 00:29:29.249 "uuid": "54678218-a44f-4ca9-9125-0a0a97d9b900", 00:29:29.249 "assigned_rate_limits": { 00:29:29.249 "rw_ios_per_sec": 0, 00:29:29.249 "rw_mbytes_per_sec": 0, 00:29:29.249 "r_mbytes_per_sec": 0, 00:29:29.249 "w_mbytes_per_sec": 0 00:29:29.249 }, 00:29:29.249 "claimed": false, 00:29:29.249 "zoned": false, 00:29:29.249 "supported_io_types": { 00:29:29.249 "read": true, 00:29:29.249 "write": true, 00:29:29.249 "unmap": true, 00:29:29.249 "flush": false, 00:29:29.249 "reset": true, 00:29:29.249 "nvme_admin": false, 00:29:29.249 "nvme_io": false, 00:29:29.249 "nvme_io_md": false, 00:29:29.249 "write_zeroes": true, 00:29:29.249 "zcopy": false, 00:29:29.249 "get_zone_info": false, 00:29:29.249 "zone_management": false, 00:29:29.249 "zone_append": false, 00:29:29.249 "compare": false, 00:29:29.249 "compare_and_write": false, 00:29:29.249 "abort": false, 00:29:29.249 "seek_hole": true, 00:29:29.249 
"seek_data": true, 00:29:29.249 "copy": false, 00:29:29.249 "nvme_iov_md": false 00:29:29.249 }, 00:29:29.249 "driver_specific": { 00:29:29.249 "lvol": { 00:29:29.249 "lvol_store_uuid": "79141023-a0a3-4fd1-aa78-f2c5ab9bd2b4", 00:29:29.249 "base_bdev": "Nvme0n1", 00:29:29.249 "thin_provision": true, 00:29:29.249 "num_allocated_clusters": 0, 00:29:29.249 "snapshot": false, 00:29:29.249 "clone": false, 00:29:29.249 "esnap_clone": false 00:29:29.249 } 00:29:29.249 } 00:29:29.249 } 00:29:29.249 ] 00:29:29.249 13:38:09 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:29:29.249 13:38:09 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:29.249 13:38:09 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:29.508 [2024-07-25 13:38:10.151856] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:29.508 COMP_lvs0/lv0 00:29:29.508 13:38:10 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:29.508 13:38:10 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:29:29.508 13:38:10 compress_compdev -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:29.508 13:38:10 compress_compdev -- common/autotest_common.sh@901 -- # local i 00:29:29.508 13:38:10 compress_compdev -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:29.508 13:38:10 compress_compdev -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:29.508 13:38:10 compress_compdev -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:29.768 13:38:10 compress_compdev -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:29.768 [ 00:29:29.768 { 00:29:29.768 "name": 
"COMP_lvs0/lv0", 00:29:29.768 "aliases": [ 00:29:29.768 "2e9cd3a9-0cbb-59be-8689-11014bdc1ca6" 00:29:29.768 ], 00:29:29.768 "product_name": "compress", 00:29:29.768 "block_size": 512, 00:29:29.768 "num_blocks": 200704, 00:29:29.768 "uuid": "2e9cd3a9-0cbb-59be-8689-11014bdc1ca6", 00:29:29.768 "assigned_rate_limits": { 00:29:29.768 "rw_ios_per_sec": 0, 00:29:29.768 "rw_mbytes_per_sec": 0, 00:29:29.768 "r_mbytes_per_sec": 0, 00:29:29.768 "w_mbytes_per_sec": 0 00:29:29.768 }, 00:29:29.768 "claimed": false, 00:29:29.768 "zoned": false, 00:29:29.768 "supported_io_types": { 00:29:29.768 "read": true, 00:29:29.768 "write": true, 00:29:29.768 "unmap": false, 00:29:29.768 "flush": false, 00:29:29.768 "reset": false, 00:29:29.768 "nvme_admin": false, 00:29:29.768 "nvme_io": false, 00:29:29.768 "nvme_io_md": false, 00:29:29.768 "write_zeroes": true, 00:29:29.768 "zcopy": false, 00:29:29.768 "get_zone_info": false, 00:29:29.768 "zone_management": false, 00:29:29.768 "zone_append": false, 00:29:29.768 "compare": false, 00:29:29.768 "compare_and_write": false, 00:29:29.768 "abort": false, 00:29:29.768 "seek_hole": false, 00:29:29.768 "seek_data": false, 00:29:29.768 "copy": false, 00:29:29.768 "nvme_iov_md": false 00:29:29.768 }, 00:29:29.768 "driver_specific": { 00:29:29.768 "compress": { 00:29:29.768 "name": "COMP_lvs0/lv0", 00:29:29.768 "base_bdev_name": "54678218-a44f-4ca9-9125-0a0a97d9b900", 00:29:29.768 "pm_path": "/tmp/pmem/b7ca5587-0feb-4faf-b79e-7879879114a1" 00:29:29.768 } 00:29:29.768 } 00:29:29.768 } 00:29:29.768 ] 00:29:29.768 13:38:10 compress_compdev -- common/autotest_common.sh@907 -- # return 0 00:29:29.768 13:38:10 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:29:30.029 [2024-07-25 13:38:10.613264] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f45941b1350 PMD being used: compress_qat 00:29:30.029 I/O targets: 00:29:30.029 COMP_lvs0/lv0: 200704 blocks of 
512 bytes (98 MiB) 00:29:30.029 00:29:30.029 00:29:30.029 CUnit - A unit testing framework for C - Version 2.1-3 00:29:30.029 http://cunit.sourceforge.net/ 00:29:30.029 00:29:30.029 00:29:30.029 Suite: bdevio tests on: COMP_lvs0/lv0 00:29:30.029 Test: blockdev write read block ...passed 00:29:30.029 Test: blockdev write zeroes read block ...passed 00:29:30.029 Test: blockdev write zeroes read no split ...passed 00:29:30.029 Test: blockdev write zeroes read split ...passed 00:29:30.029 Test: blockdev write zeroes read split partial ...passed 00:29:30.029 Test: blockdev reset ...[2024-07-25 13:38:10.747565] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:29:30.029 passed 00:29:30.029 Test: blockdev write read 8 blocks ...passed 00:29:30.029 Test: blockdev write read size > 128k ...passed 00:29:30.029 Test: blockdev write read invalid size ...passed 00:29:30.029 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:30.029 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:30.029 Test: blockdev write read max offset ...passed 00:29:30.029 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:30.029 Test: blockdev writev readv 8 blocks ...passed 00:29:30.029 Test: blockdev writev readv 30 x 1block ...passed 00:29:30.029 Test: blockdev writev readv block ...passed 00:29:30.029 Test: blockdev writev readv size > 128k ...passed 00:29:30.029 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:30.029 Test: blockdev comparev and writev ...passed 00:29:30.029 Test: blockdev nvme passthru rw ...passed 00:29:30.029 Test: blockdev nvme passthru vendor specific ...passed 00:29:30.029 Test: blockdev nvme admin passthru ...passed 00:29:30.029 Test: blockdev copy ...passed 00:29:30.029 00:29:30.029 Run Summary: Type Total Ran Passed Failed Inactive 00:29:30.029 suites 1 1 n/a 0 0 00:29:30.029 tests 23 23 23 0 0 00:29:30.029 asserts 130 130 130 0 n/a 
00:29:30.029 00:29:30.029 Elapsed time = 0.348 seconds 00:29:30.029 0 00:29:30.029 13:38:10 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:29:30.029 13:38:10 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:30.289 13:38:10 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:30.549 13:38:11 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:29:30.549 13:38:11 compress_compdev -- compress/compress.sh@62 -- # killprocess 1074075 00:29:30.549 13:38:11 compress_compdev -- common/autotest_common.sh@950 -- # '[' -z 1074075 ']' 00:29:30.549 13:38:11 compress_compdev -- common/autotest_common.sh@954 -- # kill -0 1074075 00:29:30.549 13:38:11 compress_compdev -- common/autotest_common.sh@955 -- # uname 00:29:30.549 13:38:11 compress_compdev -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:30.549 13:38:11 compress_compdev -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1074075 00:29:30.549 13:38:11 compress_compdev -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:29:30.549 13:38:11 compress_compdev -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:29:30.549 13:38:11 compress_compdev -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1074075' 00:29:30.549 killing process with pid 1074075 00:29:30.549 13:38:11 compress_compdev -- common/autotest_common.sh@969 -- # kill 1074075 00:29:30.549 13:38:11 compress_compdev -- common/autotest_common.sh@974 -- # wait 1074075 00:29:33.093 13:38:13 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:29:33.093 13:38:13 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:29:33.093 00:29:33.093 real 0m49.313s 00:29:33.093 user 1m50.801s 00:29:33.093 sys 0m4.449s 00:29:33.093 13:38:13 
compress_compdev -- common/autotest_common.sh@1126 -- # xtrace_disable 00:29:33.093 13:38:13 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:33.093 ************************************ 00:29:33.093 END TEST compress_compdev 00:29:33.093 ************************************ 00:29:33.093 13:38:13 -- spdk/autotest.sh@353 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:29:33.093 13:38:13 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:29:33.093 13:38:13 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:29:33.093 13:38:13 -- common/autotest_common.sh@10 -- # set +x 00:29:33.093 ************************************ 00:29:33.093 START TEST compress_isal 00:29:33.093 ************************************ 00:29:33.093 13:38:13 compress_isal -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:29:33.093 * Looking for test storage... 00:29:33.093 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:29:33.093 13:38:13 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:29:33.093 
13:38:13 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:33.093 13:38:13 compress_isal -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:33.093 13:38:13 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:33.093 13:38:13 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:33.093 13:38:13 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:33.093 13:38:13 compress_isal -- paths/export.sh@3 -- # 
PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:33.093 13:38:13 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:33.093 13:38:13 compress_isal -- paths/export.sh@5 -- # export PATH 00:29:33.093 13:38:13 compress_isal -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@47 -- # : 0 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i 
"$NVMF_APP_SHM_ID" -e 0xFFFF) 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:29:33.093 13:38:13 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:29:33.093 13:38:13 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:33.093 13:38:13 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:29:33.093 13:38:13 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:29:33.093 13:38:13 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:29:33.093 13:38:13 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:29:33.093 13:38:13 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1075915 00:29:33.093 13:38:13 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:33.093 13:38:13 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1075915 00:29:33.093 13:38:13 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1075915 ']' 00:29:33.093 13:38:13 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:29:33.093 13:38:13 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:33.093 13:38:13 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:33.093 13:38:13 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:33.093 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:29:33.093 13:38:13 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:33.093 13:38:13 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:29:33.354 [2024-07-25 13:38:13.899149] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:29:33.354 [2024-07-25 13:38:13.899220] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1075915 ] 00:29:33.354 [2024-07-25 13:38:13.992653] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:33.354 [2024-07-25 13:38:14.102828] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:33.354 [2024-07-25 13:38:14.102972] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:34.298 13:38:14 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:34.298 13:38:14 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:29:34.298 13:38:14 compress_isal -- compress/compress.sh@74 -- # create_vols 00:29:34.298 13:38:14 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:34.298 13:38:14 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:37.684 13:38:17 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:37.684 13:38:17 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:29:37.684 13:38:17 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:37.684 13:38:17 compress_isal -- common/autotest_common.sh@901 -- # local i 00:29:37.684 13:38:17 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:37.684 13:38:17 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:37.684 
13:38:17 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:37.684 13:38:18 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:37.684 [ 00:29:37.684 { 00:29:37.684 "name": "Nvme0n1", 00:29:37.684 "aliases": [ 00:29:37.684 "98e494a3-cf39-463c-8e42-f6d25a9745d1" 00:29:37.684 ], 00:29:37.684 "product_name": "NVMe disk", 00:29:37.684 "block_size": 512, 00:29:37.684 "num_blocks": 3907029168, 00:29:37.684 "uuid": "98e494a3-cf39-463c-8e42-f6d25a9745d1", 00:29:37.684 "assigned_rate_limits": { 00:29:37.684 "rw_ios_per_sec": 0, 00:29:37.684 "rw_mbytes_per_sec": 0, 00:29:37.684 "r_mbytes_per_sec": 0, 00:29:37.684 "w_mbytes_per_sec": 0 00:29:37.684 }, 00:29:37.684 "claimed": false, 00:29:37.684 "zoned": false, 00:29:37.684 "supported_io_types": { 00:29:37.684 "read": true, 00:29:37.684 "write": true, 00:29:37.684 "unmap": true, 00:29:37.684 "flush": true, 00:29:37.684 "reset": true, 00:29:37.684 "nvme_admin": true, 00:29:37.684 "nvme_io": true, 00:29:37.684 "nvme_io_md": false, 00:29:37.684 "write_zeroes": true, 00:29:37.684 "zcopy": false, 00:29:37.684 "get_zone_info": false, 00:29:37.684 "zone_management": false, 00:29:37.684 "zone_append": false, 00:29:37.684 "compare": false, 00:29:37.684 "compare_and_write": false, 00:29:37.684 "abort": true, 00:29:37.684 "seek_hole": false, 00:29:37.684 "seek_data": false, 00:29:37.684 "copy": false, 00:29:37.684 "nvme_iov_md": false 00:29:37.684 }, 00:29:37.684 "driver_specific": { 00:29:37.684 "nvme": [ 00:29:37.684 { 00:29:37.684 "pci_address": "0000:65:00.0", 00:29:37.684 "trid": { 00:29:37.684 "trtype": "PCIe", 00:29:37.684 "traddr": "0000:65:00.0" 00:29:37.684 }, 00:29:37.684 "ctrlr_data": { 00:29:37.684 "cntlid": 0, 00:29:37.684 "vendor_id": "0x8086", 00:29:37.684 "model_number": "INTEL SSDPE2KX020T8", 00:29:37.684 "serial_number": 
"PHLJ9512038S2P0BGN", 00:29:37.684 "firmware_revision": "VDV10184", 00:29:37.684 "oacs": { 00:29:37.684 "security": 0, 00:29:37.684 "format": 1, 00:29:37.684 "firmware": 1, 00:29:37.684 "ns_manage": 1 00:29:37.684 }, 00:29:37.684 "multi_ctrlr": false, 00:29:37.684 "ana_reporting": false 00:29:37.684 }, 00:29:37.684 "vs": { 00:29:37.684 "nvme_version": "1.2" 00:29:37.684 }, 00:29:37.684 "ns_data": { 00:29:37.684 "id": 1, 00:29:37.684 "can_share": false 00:29:37.684 } 00:29:37.684 } 00:29:37.684 ], 00:29:37.684 "mp_policy": "active_passive" 00:29:37.684 } 00:29:37.684 } 00:29:37.684 ] 00:29:37.684 13:38:18 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:29:37.684 13:38:18 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:39.120 ad880a6d-5b29-4bea-b1bb-947682bb02c6 00:29:39.120 13:38:19 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:39.120 4623a913-258d-41bd-b568-4e734ecdf8be 00:29:39.120 13:38:19 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:39.120 13:38:19 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:29:39.120 13:38:19 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:39.120 13:38:19 compress_isal -- common/autotest_common.sh@901 -- # local i 00:29:39.120 13:38:19 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:39.120 13:38:19 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:39.120 13:38:19 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:39.381 13:38:19 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 
2000 00:29:39.381 [ 00:29:39.381 { 00:29:39.381 "name": "4623a913-258d-41bd-b568-4e734ecdf8be", 00:29:39.381 "aliases": [ 00:29:39.381 "lvs0/lv0" 00:29:39.381 ], 00:29:39.381 "product_name": "Logical Volume", 00:29:39.381 "block_size": 512, 00:29:39.381 "num_blocks": 204800, 00:29:39.381 "uuid": "4623a913-258d-41bd-b568-4e734ecdf8be", 00:29:39.381 "assigned_rate_limits": { 00:29:39.381 "rw_ios_per_sec": 0, 00:29:39.381 "rw_mbytes_per_sec": 0, 00:29:39.381 "r_mbytes_per_sec": 0, 00:29:39.381 "w_mbytes_per_sec": 0 00:29:39.381 }, 00:29:39.381 "claimed": false, 00:29:39.381 "zoned": false, 00:29:39.381 "supported_io_types": { 00:29:39.381 "read": true, 00:29:39.381 "write": true, 00:29:39.381 "unmap": true, 00:29:39.381 "flush": false, 00:29:39.381 "reset": true, 00:29:39.381 "nvme_admin": false, 00:29:39.381 "nvme_io": false, 00:29:39.381 "nvme_io_md": false, 00:29:39.381 "write_zeroes": true, 00:29:39.381 "zcopy": false, 00:29:39.381 "get_zone_info": false, 00:29:39.381 "zone_management": false, 00:29:39.381 "zone_append": false, 00:29:39.381 "compare": false, 00:29:39.381 "compare_and_write": false, 00:29:39.381 "abort": false, 00:29:39.381 "seek_hole": true, 00:29:39.381 "seek_data": true, 00:29:39.381 "copy": false, 00:29:39.381 "nvme_iov_md": false 00:29:39.381 }, 00:29:39.381 "driver_specific": { 00:29:39.381 "lvol": { 00:29:39.381 "lvol_store_uuid": "ad880a6d-5b29-4bea-b1bb-947682bb02c6", 00:29:39.381 "base_bdev": "Nvme0n1", 00:29:39.381 "thin_provision": true, 00:29:39.381 "num_allocated_clusters": 0, 00:29:39.381 "snapshot": false, 00:29:39.381 "clone": false, 00:29:39.381 "esnap_clone": false 00:29:39.381 } 00:29:39.381 } 00:29:39.381 } 00:29:39.381 ] 00:29:39.643 13:38:20 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:29:39.643 13:38:20 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:39.643 13:38:20 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:39.643 [2024-07-25 13:38:20.375726] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:39.643 COMP_lvs0/lv0 00:29:39.643 13:38:20 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:39.643 13:38:20 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:29:39.643 13:38:20 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:39.643 13:38:20 compress_isal -- common/autotest_common.sh@901 -- # local i 00:29:39.643 13:38:20 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:39.643 13:38:20 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:39.643 13:38:20 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:39.904 13:38:20 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:40.167 [ 00:29:40.167 { 00:29:40.167 "name": "COMP_lvs0/lv0", 00:29:40.167 "aliases": [ 00:29:40.167 "1ab20c01-5926-59df-a5e6-9ea1e6154523" 00:29:40.167 ], 00:29:40.167 "product_name": "compress", 00:29:40.167 "block_size": 512, 00:29:40.167 "num_blocks": 200704, 00:29:40.167 "uuid": "1ab20c01-5926-59df-a5e6-9ea1e6154523", 00:29:40.167 "assigned_rate_limits": { 00:29:40.167 "rw_ios_per_sec": 0, 00:29:40.167 "rw_mbytes_per_sec": 0, 00:29:40.167 "r_mbytes_per_sec": 0, 00:29:40.167 "w_mbytes_per_sec": 0 00:29:40.167 }, 00:29:40.167 "claimed": false, 00:29:40.167 "zoned": false, 00:29:40.167 "supported_io_types": { 00:29:40.167 "read": true, 00:29:40.167 "write": true, 00:29:40.167 "unmap": false, 00:29:40.167 "flush": false, 00:29:40.167 "reset": false, 00:29:40.167 "nvme_admin": false, 00:29:40.167 "nvme_io": false, 00:29:40.167 "nvme_io_md": false, 00:29:40.167 
"write_zeroes": true, 00:29:40.167 "zcopy": false, 00:29:40.167 "get_zone_info": false, 00:29:40.167 "zone_management": false, 00:29:40.167 "zone_append": false, 00:29:40.167 "compare": false, 00:29:40.167 "compare_and_write": false, 00:29:40.167 "abort": false, 00:29:40.167 "seek_hole": false, 00:29:40.167 "seek_data": false, 00:29:40.167 "copy": false, 00:29:40.167 "nvme_iov_md": false 00:29:40.167 }, 00:29:40.167 "driver_specific": { 00:29:40.167 "compress": { 00:29:40.167 "name": "COMP_lvs0/lv0", 00:29:40.167 "base_bdev_name": "4623a913-258d-41bd-b568-4e734ecdf8be", 00:29:40.167 "pm_path": "/tmp/pmem/1cfb297d-0ba3-4e0c-9345-596e4226077b" 00:29:40.167 } 00:29:40.167 } 00:29:40.167 } 00:29:40.167 ] 00:29:40.167 13:38:20 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:29:40.167 13:38:20 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:40.428 Running I/O for 3 seconds... 00:29:43.734 00:29:43.734 Latency(us) 00:29:43.734 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:43.734 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:43.734 Verification LBA range: start 0x0 length 0x3100 00:29:43.734 COMP_lvs0/lv0 : 3.01 1091.09 4.26 0.00 0.00 29217.47 1134.28 29642.44 00:29:43.734 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:43.734 Verification LBA range: start 0x3100 length 0x3100 00:29:43.734 COMP_lvs0/lv0 : 3.02 1099.99 4.30 0.00 0.00 28933.39 351.31 29440.79 00:29:43.734 =================================================================================================================== 00:29:43.734 Total : 2191.08 8.56 0.00 0.00 29074.81 351.31 29642.44 00:29:43.734 0 00:29:43.734 13:38:24 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:29:43.734 13:38:24 compress_isal -- compress/compress.sh@29 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:43.734 13:38:24 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:43.734 13:38:24 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:43.734 13:38:24 compress_isal -- compress/compress.sh@78 -- # killprocess 1075915 00:29:43.734 13:38:24 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1075915 ']' 00:29:43.734 13:38:24 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1075915 00:29:43.734 13:38:24 compress_isal -- common/autotest_common.sh@955 -- # uname 00:29:43.734 13:38:24 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:43.734 13:38:24 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1075915 00:29:43.734 13:38:24 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:29:43.734 13:38:24 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:29:43.734 13:38:24 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1075915' 00:29:43.734 killing process with pid 1075915 00:29:43.734 13:38:24 compress_isal -- common/autotest_common.sh@969 -- # kill 1075915 00:29:43.734 Received shutdown signal, test time was about 3.000000 seconds 00:29:43.734 00:29:43.734 Latency(us) 00:29:43.734 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:43.734 =================================================================================================================== 00:29:43.734 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:43.734 13:38:24 compress_isal -- common/autotest_common.sh@974 -- # wait 1075915 00:29:46.283 13:38:26 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:29:46.283 13:38:26 compress_isal -- compress/compress.sh@66 -- # [[ isal == 
\c\o\m\p\d\e\v ]] 00:29:46.283 13:38:26 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1077995 00:29:46.283 13:38:26 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:46.283 13:38:26 compress_isal -- compress/compress.sh@73 -- # waitforlisten 1077995 00:29:46.283 13:38:26 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:29:46.283 13:38:26 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1077995 ']' 00:29:46.283 13:38:26 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:46.283 13:38:26 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:46.283 13:38:26 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:46.283 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:46.283 13:38:26 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:46.283 13:38:26 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:29:46.283 [2024-07-25 13:38:26.979261] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:29:46.283 [2024-07-25 13:38:26.979331] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1077995 ] 00:29:46.283 [2024-07-25 13:38:27.072217] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:46.544 [2024-07-25 13:38:27.182436] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:46.544 [2024-07-25 13:38:27.182444] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:29:47.115 13:38:27 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:29:47.115 13:38:27 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:29:47.115 13:38:27 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:29:47.115 13:38:27 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:47.115 13:38:27 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:50.415 13:38:30 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:50.415 13:38:30 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:29:50.415 13:38:30 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:50.415 13:38:30 compress_isal -- common/autotest_common.sh@901 -- # local i 00:29:50.415 13:38:30 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:50.415 13:38:30 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:50.415 13:38:30 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:50.415 13:38:31 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:50.675 [ 00:29:50.675 { 00:29:50.675 "name": "Nvme0n1", 00:29:50.675 "aliases": [ 00:29:50.675 "07d5cf6f-13bf-45ea-8ece-e6cf7c94f5a6" 00:29:50.675 ], 00:29:50.675 "product_name": "NVMe disk", 00:29:50.675 "block_size": 512, 00:29:50.675 "num_blocks": 3907029168, 00:29:50.675 "uuid": "07d5cf6f-13bf-45ea-8ece-e6cf7c94f5a6", 00:29:50.675 "assigned_rate_limits": { 00:29:50.675 "rw_ios_per_sec": 0, 00:29:50.675 "rw_mbytes_per_sec": 0, 00:29:50.675 "r_mbytes_per_sec": 0, 00:29:50.675 "w_mbytes_per_sec": 0 00:29:50.675 }, 00:29:50.675 "claimed": false, 00:29:50.675 "zoned": false, 00:29:50.675 "supported_io_types": { 00:29:50.675 "read": true, 00:29:50.675 "write": true, 00:29:50.675 "unmap": true, 00:29:50.675 "flush": true, 00:29:50.675 "reset": true, 00:29:50.675 "nvme_admin": true, 00:29:50.675 "nvme_io": true, 00:29:50.675 "nvme_io_md": false, 00:29:50.675 "write_zeroes": true, 00:29:50.675 "zcopy": false, 00:29:50.675 "get_zone_info": false, 00:29:50.675 "zone_management": false, 00:29:50.675 "zone_append": false, 00:29:50.675 "compare": false, 00:29:50.675 "compare_and_write": false, 00:29:50.675 "abort": true, 00:29:50.675 "seek_hole": false, 00:29:50.675 "seek_data": false, 00:29:50.675 "copy": false, 00:29:50.675 "nvme_iov_md": false 00:29:50.675 }, 00:29:50.675 "driver_specific": { 00:29:50.675 "nvme": [ 00:29:50.675 { 00:29:50.675 "pci_address": "0000:65:00.0", 00:29:50.675 "trid": { 00:29:50.675 "trtype": "PCIe", 00:29:50.675 "traddr": "0000:65:00.0" 00:29:50.675 }, 00:29:50.675 "ctrlr_data": { 00:29:50.675 "cntlid": 0, 00:29:50.675 "vendor_id": "0x8086", 00:29:50.675 "model_number": "INTEL SSDPE2KX020T8", 00:29:50.675 "serial_number": "PHLJ9512038S2P0BGN", 00:29:50.675 "firmware_revision": "VDV10184", 00:29:50.675 "oacs": { 00:29:50.675 "security": 0, 00:29:50.675 "format": 1, 00:29:50.675 "firmware": 1, 00:29:50.675 "ns_manage": 1 00:29:50.675 }, 00:29:50.675 "multi_ctrlr": false, 00:29:50.675 "ana_reporting": false 
00:29:50.675 }, 00:29:50.675 "vs": { 00:29:50.676 "nvme_version": "1.2" 00:29:50.676 }, 00:29:50.676 "ns_data": { 00:29:50.676 "id": 1, 00:29:50.676 "can_share": false 00:29:50.676 } 00:29:50.676 } 00:29:50.676 ], 00:29:50.676 "mp_policy": "active_passive" 00:29:50.676 } 00:29:50.676 } 00:29:50.676 ] 00:29:50.676 13:38:31 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:29:50.676 13:38:31 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:52.058 3ac83fe4-a722-48f2-8e7e-55d141cf0d53 00:29:52.058 13:38:32 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:52.058 c5a2f9a1-ee83-4db4-ba29-b32016342520 00:29:52.319 13:38:32 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:52.319 13:38:32 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:29:52.319 13:38:32 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:52.319 13:38:32 compress_isal -- common/autotest_common.sh@901 -- # local i 00:29:52.319 13:38:32 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:52.319 13:38:32 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:52.319 13:38:32 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:52.319 13:38:33 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:52.580 [ 00:29:52.580 { 00:29:52.580 "name": "c5a2f9a1-ee83-4db4-ba29-b32016342520", 00:29:52.580 "aliases": [ 00:29:52.580 "lvs0/lv0" 00:29:52.580 ], 00:29:52.580 "product_name": "Logical Volume", 00:29:52.580 "block_size": 512, 00:29:52.580 "num_blocks": 204800, 00:29:52.580 
"uuid": "c5a2f9a1-ee83-4db4-ba29-b32016342520", 00:29:52.580 "assigned_rate_limits": { 00:29:52.580 "rw_ios_per_sec": 0, 00:29:52.580 "rw_mbytes_per_sec": 0, 00:29:52.580 "r_mbytes_per_sec": 0, 00:29:52.580 "w_mbytes_per_sec": 0 00:29:52.580 }, 00:29:52.580 "claimed": false, 00:29:52.580 "zoned": false, 00:29:52.580 "supported_io_types": { 00:29:52.580 "read": true, 00:29:52.580 "write": true, 00:29:52.580 "unmap": true, 00:29:52.580 "flush": false, 00:29:52.580 "reset": true, 00:29:52.580 "nvme_admin": false, 00:29:52.580 "nvme_io": false, 00:29:52.580 "nvme_io_md": false, 00:29:52.580 "write_zeroes": true, 00:29:52.580 "zcopy": false, 00:29:52.580 "get_zone_info": false, 00:29:52.580 "zone_management": false, 00:29:52.580 "zone_append": false, 00:29:52.580 "compare": false, 00:29:52.580 "compare_and_write": false, 00:29:52.580 "abort": false, 00:29:52.580 "seek_hole": true, 00:29:52.580 "seek_data": true, 00:29:52.581 "copy": false, 00:29:52.581 "nvme_iov_md": false 00:29:52.581 }, 00:29:52.581 "driver_specific": { 00:29:52.581 "lvol": { 00:29:52.581 "lvol_store_uuid": "3ac83fe4-a722-48f2-8e7e-55d141cf0d53", 00:29:52.581 "base_bdev": "Nvme0n1", 00:29:52.581 "thin_provision": true, 00:29:52.581 "num_allocated_clusters": 0, 00:29:52.581 "snapshot": false, 00:29:52.581 "clone": false, 00:29:52.581 "esnap_clone": false 00:29:52.581 } 00:29:52.581 } 00:29:52.581 } 00:29:52.581 ] 00:29:52.581 13:38:33 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:29:52.581 13:38:33 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:29:52.581 13:38:33 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:29:52.840 [2024-07-25 13:38:33.470242] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:52.840 COMP_lvs0/lv0 00:29:52.840 13:38:33 compress_isal -- compress/compress.sh@46 -- # 
waitforbdev COMP_lvs0/lv0 00:29:52.840 13:38:33 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:29:52.840 13:38:33 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:29:52.840 13:38:33 compress_isal -- common/autotest_common.sh@901 -- # local i 00:29:52.840 13:38:33 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:29:52.840 13:38:33 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:29:52.840 13:38:33 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:53.100 13:38:33 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:53.360 [ 00:29:53.360 { 00:29:53.360 "name": "COMP_lvs0/lv0", 00:29:53.360 "aliases": [ 00:29:53.360 "e7a56f8d-7b2a-5d49-84ea-48f0de1cd618" 00:29:53.360 ], 00:29:53.360 "product_name": "compress", 00:29:53.360 "block_size": 512, 00:29:53.360 "num_blocks": 200704, 00:29:53.360 "uuid": "e7a56f8d-7b2a-5d49-84ea-48f0de1cd618", 00:29:53.360 "assigned_rate_limits": { 00:29:53.360 "rw_ios_per_sec": 0, 00:29:53.360 "rw_mbytes_per_sec": 0, 00:29:53.360 "r_mbytes_per_sec": 0, 00:29:53.360 "w_mbytes_per_sec": 0 00:29:53.360 }, 00:29:53.360 "claimed": false, 00:29:53.360 "zoned": false, 00:29:53.360 "supported_io_types": { 00:29:53.360 "read": true, 00:29:53.360 "write": true, 00:29:53.360 "unmap": false, 00:29:53.360 "flush": false, 00:29:53.360 "reset": false, 00:29:53.360 "nvme_admin": false, 00:29:53.360 "nvme_io": false, 00:29:53.360 "nvme_io_md": false, 00:29:53.360 "write_zeroes": true, 00:29:53.360 "zcopy": false, 00:29:53.360 "get_zone_info": false, 00:29:53.360 "zone_management": false, 00:29:53.360 "zone_append": false, 00:29:53.360 "compare": false, 00:29:53.360 "compare_and_write": false, 00:29:53.360 "abort": false, 00:29:53.360 "seek_hole": false, 
00:29:53.360 "seek_data": false, 00:29:53.360 "copy": false, 00:29:53.360 "nvme_iov_md": false 00:29:53.360 }, 00:29:53.360 "driver_specific": { 00:29:53.360 "compress": { 00:29:53.360 "name": "COMP_lvs0/lv0", 00:29:53.360 "base_bdev_name": "c5a2f9a1-ee83-4db4-ba29-b32016342520", 00:29:53.360 "pm_path": "/tmp/pmem/f6aeea27-8177-49a0-ab65-8a5c2dcabd09" 00:29:53.360 } 00:29:53.360 } 00:29:53.360 } 00:29:53.360 ] 00:29:53.360 13:38:33 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:29:53.361 13:38:33 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:53.361 Running I/O for 3 seconds... 00:29:56.658 00:29:56.658 Latency(us) 00:29:56.658 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:56.658 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:56.658 Verification LBA range: start 0x0 length 0x3100 00:29:56.658 COMP_lvs0/lv0 : 3.02 1092.11 4.27 0.00 0.00 29168.69 281.99 31658.93 00:29:56.658 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:56.658 Verification LBA range: start 0x3100 length 0x3100 00:29:56.658 COMP_lvs0/lv0 : 3.02 1093.68 4.27 0.00 0.00 29077.02 223.70 31658.93 00:29:56.658 =================================================================================================================== 00:29:56.658 Total : 2185.80 8.54 0.00 0.00 29122.80 223.70 31658.93 00:29:56.658 0 00:29:56.658 13:38:37 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:29:56.658 13:38:37 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:56.658 13:38:37 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:56.919 13:38:37 compress_isal -- compress/compress.sh@77 -- # trap - 
SIGINT SIGTERM EXIT 00:29:56.919 13:38:37 compress_isal -- compress/compress.sh@78 -- # killprocess 1077995 00:29:56.919 13:38:37 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1077995 ']' 00:29:56.919 13:38:37 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1077995 00:29:56.919 13:38:37 compress_isal -- common/autotest_common.sh@955 -- # uname 00:29:56.919 13:38:37 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:29:56.919 13:38:37 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1077995 00:29:56.919 13:38:37 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:29:56.919 13:38:37 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:29:56.919 13:38:37 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1077995' 00:29:56.919 killing process with pid 1077995 00:29:56.919 13:38:37 compress_isal -- common/autotest_common.sh@969 -- # kill 1077995 00:29:56.919 Received shutdown signal, test time was about 3.000000 seconds 00:29:56.919 00:29:56.919 Latency(us) 00:29:56.919 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:56.919 =================================================================================================================== 00:29:56.919 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:56.919 13:38:37 compress_isal -- common/autotest_common.sh@974 -- # wait 1077995 00:29:59.462 13:38:39 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:29:59.462 13:38:39 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:29:59.462 13:38:39 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=1080003 00:29:59.462 13:38:39 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:59.462 13:38:39 compress_isal -- compress/compress.sh@73 -- # waitforlisten 
1080003 00:29:59.462 13:38:39 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:29:59.462 13:38:39 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1080003 ']' 00:29:59.462 13:38:39 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:59.462 13:38:39 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:29:59.462 13:38:39 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:59.462 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:59.462 13:38:39 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:29:59.462 13:38:39 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:29:59.462 [2024-07-25 13:38:40.046070] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:29:59.462 [2024-07-25 13:38:40.046142] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1080003 ] 00:29:59.462 [2024-07-25 13:38:40.140528] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:59.462 [2024-07-25 13:38:40.251092] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:29:59.462 [2024-07-25 13:38:40.251098] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:00.402 13:38:40 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:00.402 13:38:40 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:30:00.402 13:38:40 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:30:00.402 13:38:40 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:00.402 13:38:40 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:03.702 13:38:43 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:03.702 13:38:43 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:30:03.702 13:38:43 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:03.702 13:38:43 compress_isal -- common/autotest_common.sh@901 -- # local i 00:30:03.702 13:38:43 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:03.702 13:38:43 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:03.702 13:38:43 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:03.702 13:38:44 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:03.702 [ 00:30:03.702 { 00:30:03.702 "name": "Nvme0n1", 00:30:03.702 "aliases": [ 00:30:03.702 "901b07e8-e05a-42f8-ab4c-edbfc938b55b" 00:30:03.702 ], 00:30:03.702 "product_name": "NVMe disk", 00:30:03.702 "block_size": 512, 00:30:03.702 "num_blocks": 3907029168, 00:30:03.702 "uuid": "901b07e8-e05a-42f8-ab4c-edbfc938b55b", 00:30:03.702 "assigned_rate_limits": { 00:30:03.702 "rw_ios_per_sec": 0, 00:30:03.702 "rw_mbytes_per_sec": 0, 00:30:03.702 "r_mbytes_per_sec": 0, 00:30:03.702 "w_mbytes_per_sec": 0 00:30:03.702 }, 00:30:03.702 "claimed": false, 00:30:03.702 "zoned": false, 00:30:03.702 "supported_io_types": { 00:30:03.702 "read": true, 00:30:03.702 "write": true, 00:30:03.702 "unmap": true, 00:30:03.702 "flush": true, 00:30:03.702 "reset": true, 00:30:03.702 "nvme_admin": true, 00:30:03.702 "nvme_io": true, 00:30:03.702 "nvme_io_md": false, 00:30:03.702 "write_zeroes": true, 00:30:03.702 "zcopy": false, 00:30:03.702 "get_zone_info": false, 00:30:03.702 "zone_management": false, 00:30:03.702 "zone_append": false, 00:30:03.702 "compare": false, 00:30:03.702 "compare_and_write": false, 00:30:03.702 "abort": true, 00:30:03.702 "seek_hole": false, 00:30:03.702 "seek_data": false, 00:30:03.702 "copy": false, 00:30:03.702 "nvme_iov_md": false 00:30:03.702 }, 00:30:03.702 "driver_specific": { 00:30:03.702 "nvme": [ 00:30:03.702 { 00:30:03.702 "pci_address": "0000:65:00.0", 00:30:03.702 "trid": { 00:30:03.702 "trtype": "PCIe", 00:30:03.702 "traddr": "0000:65:00.0" 00:30:03.702 }, 00:30:03.702 "ctrlr_data": { 00:30:03.702 "cntlid": 0, 00:30:03.702 "vendor_id": "0x8086", 00:30:03.702 "model_number": "INTEL SSDPE2KX020T8", 00:30:03.702 "serial_number": "PHLJ9512038S2P0BGN", 00:30:03.702 "firmware_revision": "VDV10184", 00:30:03.702 "oacs": { 00:30:03.702 "security": 0, 00:30:03.702 "format": 1, 00:30:03.702 "firmware": 1, 00:30:03.702 "ns_manage": 1 00:30:03.702 }, 00:30:03.702 "multi_ctrlr": false, 00:30:03.702 "ana_reporting": false 
00:30:03.702 }, 00:30:03.702 "vs": { 00:30:03.702 "nvme_version": "1.2" 00:30:03.702 }, 00:30:03.702 "ns_data": { 00:30:03.702 "id": 1, 00:30:03.702 "can_share": false 00:30:03.702 } 00:30:03.702 } 00:30:03.702 ], 00:30:03.702 "mp_policy": "active_passive" 00:30:03.702 } 00:30:03.702 } 00:30:03.702 ] 00:30:03.702 13:38:44 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:30:03.702 13:38:44 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:05.088 dec4fffb-da12-4564-9ecf-9dea39dec00d 00:30:05.088 13:38:45 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:05.088 3a05f324-103d-4476-ae07-64cf913b72f9 00:30:05.088 13:38:45 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:30:05.088 13:38:45 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:30:05.088 13:38:45 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:05.088 13:38:45 compress_isal -- common/autotest_common.sh@901 -- # local i 00:30:05.088 13:38:45 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:05.088 13:38:45 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:05.088 13:38:45 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:05.348 13:38:46 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:05.608 [ 00:30:05.608 { 00:30:05.608 "name": "3a05f324-103d-4476-ae07-64cf913b72f9", 00:30:05.608 "aliases": [ 00:30:05.608 "lvs0/lv0" 00:30:05.608 ], 00:30:05.608 "product_name": "Logical Volume", 00:30:05.608 "block_size": 512, 00:30:05.608 "num_blocks": 204800, 00:30:05.608 
"uuid": "3a05f324-103d-4476-ae07-64cf913b72f9", 00:30:05.608 "assigned_rate_limits": { 00:30:05.608 "rw_ios_per_sec": 0, 00:30:05.608 "rw_mbytes_per_sec": 0, 00:30:05.608 "r_mbytes_per_sec": 0, 00:30:05.608 "w_mbytes_per_sec": 0 00:30:05.608 }, 00:30:05.608 "claimed": false, 00:30:05.608 "zoned": false, 00:30:05.608 "supported_io_types": { 00:30:05.608 "read": true, 00:30:05.608 "write": true, 00:30:05.608 "unmap": true, 00:30:05.608 "flush": false, 00:30:05.608 "reset": true, 00:30:05.608 "nvme_admin": false, 00:30:05.608 "nvme_io": false, 00:30:05.608 "nvme_io_md": false, 00:30:05.608 "write_zeroes": true, 00:30:05.608 "zcopy": false, 00:30:05.608 "get_zone_info": false, 00:30:05.608 "zone_management": false, 00:30:05.608 "zone_append": false, 00:30:05.608 "compare": false, 00:30:05.608 "compare_and_write": false, 00:30:05.608 "abort": false, 00:30:05.608 "seek_hole": true, 00:30:05.608 "seek_data": true, 00:30:05.608 "copy": false, 00:30:05.608 "nvme_iov_md": false 00:30:05.608 }, 00:30:05.608 "driver_specific": { 00:30:05.608 "lvol": { 00:30:05.608 "lvol_store_uuid": "dec4fffb-da12-4564-9ecf-9dea39dec00d", 00:30:05.608 "base_bdev": "Nvme0n1", 00:30:05.608 "thin_provision": true, 00:30:05.608 "num_allocated_clusters": 0, 00:30:05.608 "snapshot": false, 00:30:05.608 "clone": false, 00:30:05.608 "esnap_clone": false 00:30:05.608 } 00:30:05.608 } 00:30:05.608 } 00:30:05.608 ] 00:30:05.608 13:38:46 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:30:05.608 13:38:46 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:30:05.608 13:38:46 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:30:05.868 [2024-07-25 13:38:46.443736] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:05.868 COMP_lvs0/lv0 00:30:05.868 13:38:46 compress_isal -- compress/compress.sh@46 -- # 
waitforbdev COMP_lvs0/lv0 00:30:05.868 13:38:46 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:30:05.868 13:38:46 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:05.868 13:38:46 compress_isal -- common/autotest_common.sh@901 -- # local i 00:30:05.868 13:38:46 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:05.868 13:38:46 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:05.868 13:38:46 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:06.128 13:38:46 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:06.128 [ 00:30:06.128 { 00:30:06.128 "name": "COMP_lvs0/lv0", 00:30:06.128 "aliases": [ 00:30:06.128 "a2e0a0dd-4499-568a-a276-6ef1a0d93ccc" 00:30:06.128 ], 00:30:06.128 "product_name": "compress", 00:30:06.128 "block_size": 4096, 00:30:06.128 "num_blocks": 25088, 00:30:06.128 "uuid": "a2e0a0dd-4499-568a-a276-6ef1a0d93ccc", 00:30:06.128 "assigned_rate_limits": { 00:30:06.128 "rw_ios_per_sec": 0, 00:30:06.128 "rw_mbytes_per_sec": 0, 00:30:06.128 "r_mbytes_per_sec": 0, 00:30:06.128 "w_mbytes_per_sec": 0 00:30:06.128 }, 00:30:06.128 "claimed": false, 00:30:06.128 "zoned": false, 00:30:06.128 "supported_io_types": { 00:30:06.128 "read": true, 00:30:06.128 "write": true, 00:30:06.128 "unmap": false, 00:30:06.128 "flush": false, 00:30:06.128 "reset": false, 00:30:06.128 "nvme_admin": false, 00:30:06.128 "nvme_io": false, 00:30:06.128 "nvme_io_md": false, 00:30:06.128 "write_zeroes": true, 00:30:06.128 "zcopy": false, 00:30:06.128 "get_zone_info": false, 00:30:06.128 "zone_management": false, 00:30:06.128 "zone_append": false, 00:30:06.128 "compare": false, 00:30:06.128 "compare_and_write": false, 00:30:06.128 "abort": false, 00:30:06.128 "seek_hole": false, 
00:30:06.128 "seek_data": false, 00:30:06.128 "copy": false, 00:30:06.128 "nvme_iov_md": false 00:30:06.128 }, 00:30:06.128 "driver_specific": { 00:30:06.128 "compress": { 00:30:06.128 "name": "COMP_lvs0/lv0", 00:30:06.128 "base_bdev_name": "3a05f324-103d-4476-ae07-64cf913b72f9", 00:30:06.128 "pm_path": "/tmp/pmem/d4d0a655-439d-41c3-9807-e38836fb21b8" 00:30:06.128 } 00:30:06.128 } 00:30:06.128 } 00:30:06.128 ] 00:30:06.128 13:38:46 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:30:06.128 13:38:46 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:30:06.388 Running I/O for 3 seconds... 00:30:09.686 00:30:09.686 Latency(us) 00:30:09.686 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:09.686 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:30:09.686 Verification LBA range: start 0x0 length 0x3100 00:30:09.686 COMP_lvs0/lv0 : 3.02 1121.77 4.38 0.00 0.00 28401.52 765.64 29440.79 00:30:09.686 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:30:09.686 Verification LBA range: start 0x3100 length 0x3100 00:30:09.686 COMP_lvs0/lv0 : 3.02 1128.15 4.41 0.00 0.00 28187.61 500.97 28835.84 00:30:09.686 =================================================================================================================== 00:30:09.686 Total : 2249.92 8.79 0.00 0.00 28294.25 500.97 29440.79 00:30:09.686 0 00:30:09.686 13:38:50 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:30:09.686 13:38:50 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:09.686 13:38:50 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:09.946 13:38:50 compress_isal -- compress/compress.sh@77 -- # trap - 
SIGINT SIGTERM EXIT 00:30:09.946 13:38:50 compress_isal -- compress/compress.sh@78 -- # killprocess 1080003 00:30:09.946 13:38:50 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1080003 ']' 00:30:09.946 13:38:50 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1080003 00:30:09.946 13:38:50 compress_isal -- common/autotest_common.sh@955 -- # uname 00:30:09.946 13:38:50 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:09.946 13:38:50 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1080003 00:30:09.946 13:38:50 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:30:09.946 13:38:50 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:30:09.946 13:38:50 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1080003' 00:30:09.946 killing process with pid 1080003 00:30:09.946 13:38:50 compress_isal -- common/autotest_common.sh@969 -- # kill 1080003 00:30:09.946 Received shutdown signal, test time was about 3.000000 seconds 00:30:09.946 00:30:09.946 Latency(us) 00:30:09.946 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:09.946 =================================================================================================================== 00:30:09.946 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:30:09.946 13:38:50 compress_isal -- common/autotest_common.sh@974 -- # wait 1080003 00:30:12.517 13:38:52 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:30:12.517 13:38:52 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:30:12.517 13:38:52 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=1082100 00:30:12.517 13:38:52 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:30:12.517 13:38:52 compress_isal -- compress/compress.sh@57 -- # waitforlisten 1082100 00:30:12.517 
13:38:52 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:30:12.517 13:38:52 compress_isal -- common/autotest_common.sh@831 -- # '[' -z 1082100 ']' 00:30:12.517 13:38:52 compress_isal -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:12.517 13:38:52 compress_isal -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:12.517 13:38:52 compress_isal -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:12.517 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:12.517 13:38:52 compress_isal -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:12.517 13:38:52 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:12.517 [2024-07-25 13:38:52.979712] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:30:12.517 [2024-07-25 13:38:52.979782] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1082100 ] 00:30:12.517 [2024-07-25 13:38:53.071633] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:12.517 [2024-07-25 13:38:53.166416] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:12.517 [2024-07-25 13:38:53.166595] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:12.517 [2024-07-25 13:38:53.166671] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:13.088 13:38:53 compress_isal -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:13.088 13:38:53 compress_isal -- common/autotest_common.sh@864 -- # return 0 00:30:13.088 13:38:53 compress_isal -- compress/compress.sh@58 -- # create_vols 00:30:13.088 13:38:53 compress_isal -- 
compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:30:13.088 13:38:53 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:30:16.422 13:38:56 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:30:16.422 13:38:56 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=Nvme0n1 00:30:16.422 13:38:56 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:16.422 13:38:56 compress_isal -- common/autotest_common.sh@901 -- # local i 00:30:16.422 13:38:56 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:16.422 13:38:56 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:16.422 13:38:56 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:16.422 13:38:57 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:30:16.684 [ 00:30:16.684 { 00:30:16.684 "name": "Nvme0n1", 00:30:16.684 "aliases": [ 00:30:16.684 "7e1e5812-2b57-4b94-8aca-22f88d1a7e69" 00:30:16.684 ], 00:30:16.684 "product_name": "NVMe disk", 00:30:16.684 "block_size": 512, 00:30:16.684 "num_blocks": 3907029168, 00:30:16.684 "uuid": "7e1e5812-2b57-4b94-8aca-22f88d1a7e69", 00:30:16.684 "assigned_rate_limits": { 00:30:16.684 "rw_ios_per_sec": 0, 00:30:16.684 "rw_mbytes_per_sec": 0, 00:30:16.684 "r_mbytes_per_sec": 0, 00:30:16.684 "w_mbytes_per_sec": 0 00:30:16.684 }, 00:30:16.684 "claimed": false, 00:30:16.684 "zoned": false, 00:30:16.684 "supported_io_types": { 00:30:16.684 "read": true, 00:30:16.684 "write": true, 00:30:16.684 "unmap": true, 00:30:16.684 "flush": true, 00:30:16.684 "reset": true, 00:30:16.684 "nvme_admin": true, 00:30:16.684 "nvme_io": true, 00:30:16.684 "nvme_io_md": false, 00:30:16.684 
"write_zeroes": true, 00:30:16.684 "zcopy": false, 00:30:16.684 "get_zone_info": false, 00:30:16.684 "zone_management": false, 00:30:16.684 "zone_append": false, 00:30:16.684 "compare": false, 00:30:16.684 "compare_and_write": false, 00:30:16.684 "abort": true, 00:30:16.684 "seek_hole": false, 00:30:16.684 "seek_data": false, 00:30:16.684 "copy": false, 00:30:16.684 "nvme_iov_md": false 00:30:16.684 }, 00:30:16.684 "driver_specific": { 00:30:16.684 "nvme": [ 00:30:16.684 { 00:30:16.684 "pci_address": "0000:65:00.0", 00:30:16.684 "trid": { 00:30:16.684 "trtype": "PCIe", 00:30:16.684 "traddr": "0000:65:00.0" 00:30:16.684 }, 00:30:16.684 "ctrlr_data": { 00:30:16.684 "cntlid": 0, 00:30:16.684 "vendor_id": "0x8086", 00:30:16.684 "model_number": "INTEL SSDPE2KX020T8", 00:30:16.684 "serial_number": "PHLJ9512038S2P0BGN", 00:30:16.684 "firmware_revision": "VDV10184", 00:30:16.684 "oacs": { 00:30:16.684 "security": 0, 00:30:16.684 "format": 1, 00:30:16.684 "firmware": 1, 00:30:16.684 "ns_manage": 1 00:30:16.684 }, 00:30:16.684 "multi_ctrlr": false, 00:30:16.684 "ana_reporting": false 00:30:16.684 }, 00:30:16.684 "vs": { 00:30:16.684 "nvme_version": "1.2" 00:30:16.684 }, 00:30:16.684 "ns_data": { 00:30:16.684 "id": 1, 00:30:16.684 "can_share": false 00:30:16.684 } 00:30:16.684 } 00:30:16.684 ], 00:30:16.684 "mp_policy": "active_passive" 00:30:16.684 } 00:30:16.684 } 00:30:16.684 ] 00:30:16.684 13:38:57 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:30:16.684 13:38:57 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:30:18.067 ab6d0574-8806-4f52-b6be-785dea7f57da 00:30:18.067 13:38:58 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:30:18.067 344a21e5-34f4-4eeb-a8db-6bc685afdfd2 00:30:18.067 13:38:58 compress_isal -- compress/compress.sh@39 -- # 
waitforbdev lvs0/lv0 00:30:18.067 13:38:58 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=lvs0/lv0 00:30:18.067 13:38:58 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:18.067 13:38:58 compress_isal -- common/autotest_common.sh@901 -- # local i 00:30:18.067 13:38:58 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:18.067 13:38:58 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:18.067 13:38:58 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:18.328 13:38:59 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:30:18.589 [ 00:30:18.589 { 00:30:18.589 "name": "344a21e5-34f4-4eeb-a8db-6bc685afdfd2", 00:30:18.589 "aliases": [ 00:30:18.589 "lvs0/lv0" 00:30:18.589 ], 00:30:18.589 "product_name": "Logical Volume", 00:30:18.589 "block_size": 512, 00:30:18.589 "num_blocks": 204800, 00:30:18.589 "uuid": "344a21e5-34f4-4eeb-a8db-6bc685afdfd2", 00:30:18.589 "assigned_rate_limits": { 00:30:18.589 "rw_ios_per_sec": 0, 00:30:18.589 "rw_mbytes_per_sec": 0, 00:30:18.589 "r_mbytes_per_sec": 0, 00:30:18.589 "w_mbytes_per_sec": 0 00:30:18.589 }, 00:30:18.589 "claimed": false, 00:30:18.589 "zoned": false, 00:30:18.589 "supported_io_types": { 00:30:18.589 "read": true, 00:30:18.589 "write": true, 00:30:18.589 "unmap": true, 00:30:18.589 "flush": false, 00:30:18.589 "reset": true, 00:30:18.589 "nvme_admin": false, 00:30:18.589 "nvme_io": false, 00:30:18.589 "nvme_io_md": false, 00:30:18.589 "write_zeroes": true, 00:30:18.589 "zcopy": false, 00:30:18.589 "get_zone_info": false, 00:30:18.589 "zone_management": false, 00:30:18.589 "zone_append": false, 00:30:18.589 "compare": false, 00:30:18.589 "compare_and_write": false, 00:30:18.589 "abort": false, 00:30:18.589 "seek_hole": true, 00:30:18.589 
"seek_data": true, 00:30:18.589 "copy": false, 00:30:18.589 "nvme_iov_md": false 00:30:18.589 }, 00:30:18.589 "driver_specific": { 00:30:18.589 "lvol": { 00:30:18.589 "lvol_store_uuid": "ab6d0574-8806-4f52-b6be-785dea7f57da", 00:30:18.589 "base_bdev": "Nvme0n1", 00:30:18.589 "thin_provision": true, 00:30:18.589 "num_allocated_clusters": 0, 00:30:18.589 "snapshot": false, 00:30:18.589 "clone": false, 00:30:18.589 "esnap_clone": false 00:30:18.589 } 00:30:18.589 } 00:30:18.589 } 00:30:18.589 ] 00:30:18.589 13:38:59 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:30:18.589 13:38:59 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:30:18.589 13:38:59 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:30:18.849 [2024-07-25 13:38:59.402858] vbdev_compress.c:1008:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:30:18.849 COMP_lvs0/lv0 00:30:18.849 13:38:59 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:30:18.849 13:38:59 compress_isal -- common/autotest_common.sh@899 -- # local bdev_name=COMP_lvs0/lv0 00:30:18.849 13:38:59 compress_isal -- common/autotest_common.sh@900 -- # local bdev_timeout= 00:30:18.849 13:38:59 compress_isal -- common/autotest_common.sh@901 -- # local i 00:30:18.849 13:38:59 compress_isal -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:30:18.849 13:38:59 compress_isal -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:30:18.849 13:38:59 compress_isal -- common/autotest_common.sh@904 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:30:18.849 13:38:59 compress_isal -- common/autotest_common.sh@906 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:30:19.108 [ 00:30:19.108 { 00:30:19.108 "name": "COMP_lvs0/lv0", 00:30:19.108 "aliases": 
[ 00:30:19.108 "e9d1960b-4018-57c6-a6b2-a7bb6d6ddc9b" 00:30:19.108 ], 00:30:19.108 "product_name": "compress", 00:30:19.108 "block_size": 512, 00:30:19.108 "num_blocks": 200704, 00:30:19.108 "uuid": "e9d1960b-4018-57c6-a6b2-a7bb6d6ddc9b", 00:30:19.108 "assigned_rate_limits": { 00:30:19.108 "rw_ios_per_sec": 0, 00:30:19.108 "rw_mbytes_per_sec": 0, 00:30:19.108 "r_mbytes_per_sec": 0, 00:30:19.108 "w_mbytes_per_sec": 0 00:30:19.108 }, 00:30:19.108 "claimed": false, 00:30:19.108 "zoned": false, 00:30:19.108 "supported_io_types": { 00:30:19.108 "read": true, 00:30:19.108 "write": true, 00:30:19.108 "unmap": false, 00:30:19.108 "flush": false, 00:30:19.108 "reset": false, 00:30:19.108 "nvme_admin": false, 00:30:19.108 "nvme_io": false, 00:30:19.108 "nvme_io_md": false, 00:30:19.108 "write_zeroes": true, 00:30:19.108 "zcopy": false, 00:30:19.108 "get_zone_info": false, 00:30:19.108 "zone_management": false, 00:30:19.108 "zone_append": false, 00:30:19.108 "compare": false, 00:30:19.108 "compare_and_write": false, 00:30:19.108 "abort": false, 00:30:19.108 "seek_hole": false, 00:30:19.108 "seek_data": false, 00:30:19.108 "copy": false, 00:30:19.108 "nvme_iov_md": false 00:30:19.108 }, 00:30:19.108 "driver_specific": { 00:30:19.108 "compress": { 00:30:19.108 "name": "COMP_lvs0/lv0", 00:30:19.108 "base_bdev_name": "344a21e5-34f4-4eeb-a8db-6bc685afdfd2", 00:30:19.108 "pm_path": "/tmp/pmem/23ecdd99-e634-434f-88d5-72942f2751d1" 00:30:19.108 } 00:30:19.108 } 00:30:19.108 } 00:30:19.108 ] 00:30:19.108 13:38:59 compress_isal -- common/autotest_common.sh@907 -- # return 0 00:30:19.108 13:38:59 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:19.368 I/O targets: 00:30:19.368 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:30:19.368 00:30:19.368 00:30:19.368 CUnit - A unit testing framework for C - Version 2.1-3 00:30:19.368 http://cunit.sourceforge.net/ 00:30:19.368 00:30:19.368 
00:30:19.368 Suite: bdevio tests on: COMP_lvs0/lv0 00:30:19.368 Test: blockdev write read block ...passed 00:30:19.368 Test: blockdev write zeroes read block ...passed 00:30:19.368 Test: blockdev write zeroes read no split ...passed 00:30:19.368 Test: blockdev write zeroes read split ...passed 00:30:19.368 Test: blockdev write zeroes read split partial ...passed 00:30:19.368 Test: blockdev reset ...[2024-07-25 13:39:00.083088] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:30:19.368 passed 00:30:19.368 Test: blockdev write read 8 blocks ...passed 00:30:19.368 Test: blockdev write read size > 128k ...passed 00:30:19.368 Test: blockdev write read invalid size ...passed 00:30:19.368 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:19.368 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:19.368 Test: blockdev write read max offset ...passed 00:30:19.368 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:19.368 Test: blockdev writev readv 8 blocks ...passed 00:30:19.369 Test: blockdev writev readv 30 x 1block ...passed 00:30:19.369 Test: blockdev writev readv block ...passed 00:30:19.369 Test: blockdev writev readv size > 128k ...passed 00:30:19.369 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:19.369 Test: blockdev comparev and writev ...passed 00:30:19.369 Test: blockdev nvme passthru rw ...passed 00:30:19.369 Test: blockdev nvme passthru vendor specific ...passed 00:30:19.369 Test: blockdev nvme admin passthru ...passed 00:30:19.369 Test: blockdev copy ...passed 00:30:19.369 00:30:19.369 Run Summary: Type Total Ran Passed Failed Inactive 00:30:19.369 suites 1 1 n/a 0 0 00:30:19.369 tests 23 23 23 0 0 00:30:19.369 asserts 130 130 130 0 n/a 00:30:19.369 00:30:19.369 Elapsed time = 0.428 seconds 00:30:19.369 0 00:30:19.369 13:39:00 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:30:19.369 13:39:00 compress_isal -- 
compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:30:19.629 13:39:00 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:30:19.889 13:39:00 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:30:19.889 13:39:00 compress_isal -- compress/compress.sh@62 -- # killprocess 1082100 00:30:19.889 13:39:00 compress_isal -- common/autotest_common.sh@950 -- # '[' -z 1082100 ']' 00:30:19.889 13:39:00 compress_isal -- common/autotest_common.sh@954 -- # kill -0 1082100 00:30:19.889 13:39:00 compress_isal -- common/autotest_common.sh@955 -- # uname 00:30:19.889 13:39:00 compress_isal -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:19.889 13:39:00 compress_isal -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1082100 00:30:19.889 13:39:00 compress_isal -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:19.889 13:39:00 compress_isal -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:19.889 13:39:00 compress_isal -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1082100' 00:30:19.889 killing process with pid 1082100 00:30:19.889 13:39:00 compress_isal -- common/autotest_common.sh@969 -- # kill 1082100 00:30:19.889 13:39:00 compress_isal -- common/autotest_common.sh@974 -- # wait 1082100 00:30:22.433 13:39:03 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:30:22.433 13:39:03 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:30:22.433 00:30:22.433 real 0m49.306s 00:30:22.433 user 1m52.114s 00:30:22.433 sys 0m3.433s 00:30:22.433 13:39:03 compress_isal -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:22.433 13:39:03 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:22.433 ************************************ 00:30:22.433 END TEST compress_isal 
00:30:22.433 ************************************ 00:30:22.433 13:39:03 -- spdk/autotest.sh@356 -- # '[' 0 -eq 1 ']' 00:30:22.433 13:39:03 -- spdk/autotest.sh@360 -- # '[' 1 -eq 1 ']' 00:30:22.433 13:39:03 -- spdk/autotest.sh@361 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:30:22.433 13:39:03 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:30:22.433 13:39:03 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:22.433 13:39:03 -- common/autotest_common.sh@10 -- # set +x 00:30:22.433 ************************************ 00:30:22.433 START TEST blockdev_crypto_aesni 00:30:22.433 ************************************ 00:30:22.433 13:39:03 blockdev_crypto_aesni -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:30:22.433 * Looking for test storage... 00:30:22.433 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:22.433 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:30:22.433 13:39:03 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:30:22.433 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:30:22.433 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:22.433 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:30:22.433 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:30:22.433 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:30:22.433 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # 
RPC_PIPE_TIMEOUT=30 00:30:22.433 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:30:22.433 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # uname -s 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@681 -- # test_type=crypto_aesni 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # crypto_device= 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # dek= 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # env_ctx= 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == bdev ]] 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@689 -- # [[ crypto_aesni == crypto_* ]] 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1083982 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 1083982 00:30:22.434 13:39:03 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:30:22.434 13:39:03 blockdev_crypto_aesni -- common/autotest_common.sh@831 -- # '[' -z 1083982 ']' 00:30:22.434 13:39:03 blockdev_crypto_aesni -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:22.434 13:39:03 blockdev_crypto_aesni -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:22.434 13:39:03 blockdev_crypto_aesni -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:22.434 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:22.434 13:39:03 blockdev_crypto_aesni -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:22.434 13:39:03 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:22.694 [2024-07-25 13:39:03.267831] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:30:22.694 [2024-07-25 13:39:03.267898] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1083982 ] 00:30:22.695 [2024-07-25 13:39:03.358485] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:22.695 [2024-07-25 13:39:03.426750] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:23.634 13:39:04 blockdev_crypto_aesni -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:23.634 13:39:04 blockdev_crypto_aesni -- common/autotest_common.sh@864 -- # return 0 00:30:23.634 13:39:04 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:30:23.634 13:39:04 blockdev_crypto_aesni -- bdev/blockdev.sh@704 -- # setup_crypto_aesni_conf 00:30:23.634 13:39:04 blockdev_crypto_aesni -- bdev/blockdev.sh@145 -- # rpc_cmd 00:30:23.634 13:39:04 blockdev_crypto_aesni -- 
common/autotest_common.sh@561 -- # xtrace_disable 00:30:23.634 13:39:04 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:23.634 [2024-07-25 13:39:04.120729] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:23.634 [2024-07-25 13:39:04.128760] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:23.634 [2024-07-25 13:39:04.136776] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:23.634 [2024-07-25 13:39:04.195028] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:26.173 true 00:30:26.173 true 00:30:26.173 true 00:30:26.173 true 00:30:26.173 Malloc0 00:30:26.173 Malloc1 00:30:26.173 Malloc2 00:30:26.173 Malloc3 00:30:26.173 [2024-07-25 13:39:06.474738] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:26.173 crypto_ram 00:30:26.173 [2024-07-25 13:39:06.482760] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:26.173 crypto_ram2 00:30:26.173 [2024-07-25 13:39:06.490776] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:26.173 crypto_ram3 00:30:26.173 [2024-07-25 13:39:06.498798] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:26.173 crypto_ram4 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:30:26.173 
13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # cat 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@561 -- # xtrace_disable 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r .name 
00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f00dd951-b358-54b6-abb5-6e7cd2acb55e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f00dd951-b358-54b6-abb5-6e7cd2acb55e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "1abd6d37-a1d5-5ec0-93b8-aff478c73e72"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1abd6d37-a1d5-5ec0-93b8-aff478c73e72",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "06bb7e5b-d3b8-5654-abed-87db269e11c6"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "06bb7e5b-d3b8-5654-abed-87db269e11c6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "fb2bdebb-e0d6-536c-84c0-cb66febb476c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "fb2bdebb-e0d6-536c-84c0-cb66febb476c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' 
"nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:30:26.173 13:39:06 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # killprocess 1083982 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@950 -- # '[' -z 1083982 ']' 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # kill -0 1083982 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # uname 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1083982 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1083982' 00:30:26.173 killing process with pid 1083982 00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@969 -- # kill 1083982 
00:30:26.173 13:39:06 blockdev_crypto_aesni -- common/autotest_common.sh@974 -- # wait 1083982 00:30:26.434 13:39:07 blockdev_crypto_aesni -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:26.434 13:39:07 blockdev_crypto_aesni -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:26.434 13:39:07 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:30:26.434 13:39:07 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:26.434 13:39:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:26.434 ************************************ 00:30:26.434 START TEST bdev_hello_world 00:30:26.434 ************************************ 00:30:26.434 13:39:07 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:26.434 [2024-07-25 13:39:07.145825] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:30:26.434 [2024-07-25 13:39:07.145872] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1084622 ] 00:30:26.694 [2024-07-25 13:39:07.231081] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:26.694 [2024-07-25 13:39:07.302300] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:26.694 [2024-07-25 13:39:07.323371] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:26.694 [2024-07-25 13:39:07.331393] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:26.694 [2024-07-25 13:39:07.339410] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:26.694 [2024-07-25 13:39:07.424338] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:29.235 [2024-07-25 13:39:09.593359] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:29.235 [2024-07-25 13:39:09.593405] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:29.235 [2024-07-25 13:39:09.593413] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:29.235 [2024-07-25 13:39:09.601377] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:29.235 [2024-07-25 13:39:09.601389] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:29.235 [2024-07-25 13:39:09.601395] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:29.235 [2024-07-25 13:39:09.609396] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found 
key "test_dek_aesni_cbc_3" 00:30:29.235 [2024-07-25 13:39:09.609408] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:29.235 [2024-07-25 13:39:09.609414] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:29.235 [2024-07-25 13:39:09.617415] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:29.235 [2024-07-25 13:39:09.617426] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:29.235 [2024-07-25 13:39:09.617432] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:29.235 [2024-07-25 13:39:09.678766] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:30:29.235 [2024-07-25 13:39:09.678795] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:30:29.235 [2024-07-25 13:39:09.678805] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:30:29.235 [2024-07-25 13:39:09.679833] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:30:29.235 [2024-07-25 13:39:09.679898] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:30:29.235 [2024-07-25 13:39:09.679908] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:30:29.235 [2024-07-25 13:39:09.679940] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:30:29.235 00:30:29.235 [2024-07-25 13:39:09.679950] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:30:29.235 00:30:29.235 real 0m2.818s 00:30:29.235 user 0m2.539s 00:30:29.235 sys 0m0.239s 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:30:29.235 ************************************ 00:30:29.235 END TEST bdev_hello_world 00:30:29.235 ************************************ 00:30:29.235 13:39:09 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:30:29.235 13:39:09 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:30:29.235 13:39:09 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:29.235 13:39:09 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:29.235 ************************************ 00:30:29.235 START TEST bdev_bounds 00:30:29.235 ************************************ 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1085301 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1085301' 00:30:29.235 Process bdevio pid: 1085301 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1085301 00:30:29.235 13:39:09 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@831 -- # '[' -z 1085301 ']' 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:29.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:29.235 13:39:09 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:29.496 [2024-07-25 13:39:10.044159] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:30:29.496 [2024-07-25 13:39:10.044206] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1085301 ] 00:30:29.496 [2024-07-25 13:39:10.131191] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:29.496 [2024-07-25 13:39:10.197575] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:30:29.496 [2024-07-25 13:39:10.197665] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:29.496 [2024-07-25 13:39:10.197665] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:30:29.496 [2024-07-25 13:39:10.218808] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:29.496 [2024-07-25 13:39:10.226834] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:29.496 [2024-07-25 
13:39:10.234853] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:29.755 [2024-07-25 13:39:10.323078] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:32.297 [2024-07-25 13:39:12.487309] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:32.297 [2024-07-25 13:39:12.487365] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:32.297 [2024-07-25 13:39:12.487373] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:32.297 [2024-07-25 13:39:12.495329] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:32.297 [2024-07-25 13:39:12.495341] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:32.297 [2024-07-25 13:39:12.495347] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:32.297 [2024-07-25 13:39:12.503350] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:32.297 [2024-07-25 13:39:12.503362] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:32.297 [2024-07-25 13:39:12.503367] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:32.297 [2024-07-25 13:39:12.511369] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:32.297 [2024-07-25 13:39:12.511381] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:32.297 [2024-07-25 13:39:12.511387] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:32.297 13:39:12 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:32.297 13:39:12 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:30:32.297 13:39:12 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:32.297 I/O targets: 00:30:32.297 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:30:32.297 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:30:32.297 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:30:32.297 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:30:32.297 00:30:32.297 00:30:32.297 CUnit - A unit testing framework for C - Version 2.1-3 00:30:32.297 http://cunit.sourceforge.net/ 00:30:32.297 00:30:32.297 00:30:32.297 Suite: bdevio tests on: crypto_ram4 00:30:32.297 Test: blockdev write read block ...passed 00:30:32.297 Test: blockdev write zeroes read block ...passed 00:30:32.297 Test: blockdev write zeroes read no split ...passed 00:30:32.297 Test: blockdev write zeroes read split ...passed 00:30:32.297 Test: blockdev write zeroes read split partial ...passed 00:30:32.297 Test: blockdev reset ...passed 00:30:32.297 Test: blockdev write read 8 blocks ...passed 00:30:32.297 Test: blockdev write read size > 128k ...passed 00:30:32.297 Test: blockdev write read invalid size ...passed 00:30:32.297 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:32.297 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:32.297 Test: blockdev write read max offset ...passed 00:30:32.297 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:32.297 Test: blockdev writev readv 8 blocks ...passed 00:30:32.297 Test: blockdev writev readv 30 x 1block ...passed 00:30:32.297 Test: blockdev writev readv block ...passed 00:30:32.297 Test: blockdev writev readv size > 128k ...passed 00:30:32.297 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:32.297 Test: blockdev comparev and writev ...passed 00:30:32.297 Test: blockdev nvme 
passthru rw ...passed 00:30:32.297 Test: blockdev nvme passthru vendor specific ...passed 00:30:32.297 Test: blockdev nvme admin passthru ...passed 00:30:32.297 Test: blockdev copy ...passed 00:30:32.297 Suite: bdevio tests on: crypto_ram3 00:30:32.297 Test: blockdev write read block ...passed 00:30:32.297 Test: blockdev write zeroes read block ...passed 00:30:32.297 Test: blockdev write zeroes read no split ...passed 00:30:32.297 Test: blockdev write zeroes read split ...passed 00:30:32.297 Test: blockdev write zeroes read split partial ...passed 00:30:32.297 Test: blockdev reset ...passed 00:30:32.297 Test: blockdev write read 8 blocks ...passed 00:30:32.297 Test: blockdev write read size > 128k ...passed 00:30:32.297 Test: blockdev write read invalid size ...passed 00:30:32.297 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:32.297 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:32.297 Test: blockdev write read max offset ...passed 00:30:32.297 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:32.297 Test: blockdev writev readv 8 blocks ...passed 00:30:32.297 Test: blockdev writev readv 30 x 1block ...passed 00:30:32.297 Test: blockdev writev readv block ...passed 00:30:32.297 Test: blockdev writev readv size > 128k ...passed 00:30:32.297 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:32.297 Test: blockdev comparev and writev ...passed 00:30:32.297 Test: blockdev nvme passthru rw ...passed 00:30:32.297 Test: blockdev nvme passthru vendor specific ...passed 00:30:32.297 Test: blockdev nvme admin passthru ...passed 00:30:32.297 Test: blockdev copy ...passed 00:30:32.297 Suite: bdevio tests on: crypto_ram2 00:30:32.297 Test: blockdev write read block ...passed 00:30:32.297 Test: blockdev write zeroes read block ...passed 00:30:32.297 Test: blockdev write zeroes read no split ...passed 00:30:32.297 Test: blockdev write zeroes read split ...passed 
00:30:32.557 Test: blockdev write zeroes read split partial ...passed 00:30:32.557 Test: blockdev reset ...passed 00:30:32.557 Test: blockdev write read 8 blocks ...passed 00:30:32.557 Test: blockdev write read size > 128k ...passed 00:30:32.557 Test: blockdev write read invalid size ...passed 00:30:32.557 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:32.557 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:32.557 Test: blockdev write read max offset ...passed 00:30:32.557 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:32.557 Test: blockdev writev readv 8 blocks ...passed 00:30:32.557 Test: blockdev writev readv 30 x 1block ...passed 00:30:32.557 Test: blockdev writev readv block ...passed 00:30:32.557 Test: blockdev writev readv size > 128k ...passed 00:30:32.557 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:32.557 Test: blockdev comparev and writev ...passed 00:30:32.557 Test: blockdev nvme passthru rw ...passed 00:30:32.557 Test: blockdev nvme passthru vendor specific ...passed 00:30:32.557 Test: blockdev nvme admin passthru ...passed 00:30:32.557 Test: blockdev copy ...passed 00:30:32.558 Suite: bdevio tests on: crypto_ram 00:30:32.558 Test: blockdev write read block ...passed 00:30:32.558 Test: blockdev write zeroes read block ...passed 00:30:32.558 Test: blockdev write zeroes read no split ...passed 00:30:32.816 Test: blockdev write zeroes read split ...passed 00:30:33.075 Test: blockdev write zeroes read split partial ...passed 00:30:33.075 Test: blockdev reset ...passed 00:30:33.075 Test: blockdev write read 8 blocks ...passed 00:30:33.075 Test: blockdev write read size > 128k ...passed 00:30:33.075 Test: blockdev write read invalid size ...passed 00:30:33.075 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:33.075 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:33.075 Test: blockdev 
write read max offset ...passed 00:30:33.075 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:33.075 Test: blockdev writev readv 8 blocks ...passed 00:30:33.075 Test: blockdev writev readv 30 x 1block ...passed 00:30:33.075 Test: blockdev writev readv block ...passed 00:30:33.075 Test: blockdev writev readv size > 128k ...passed 00:30:33.075 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:33.075 Test: blockdev comparev and writev ...passed 00:30:33.075 Test: blockdev nvme passthru rw ...passed 00:30:33.075 Test: blockdev nvme passthru vendor specific ...passed 00:30:33.075 Test: blockdev nvme admin passthru ...passed 00:30:33.075 Test: blockdev copy ...passed 00:30:33.075 00:30:33.075 Run Summary: Type Total Ran Passed Failed Inactive 00:30:33.075 suites 4 4 n/a 0 0 00:30:33.075 tests 92 92 92 0 0 00:30:33.075 asserts 520 520 520 0 n/a 00:30:33.075 00:30:33.075 Elapsed time = 1.884 seconds 00:30:33.075 0 00:30:33.075 13:39:13 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1085301 00:30:33.075 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1085301 ']' 00:30:33.075 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1085301 00:30:33.075 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # uname 00:30:33.075 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:30:33.075 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1085301 00:30:33.075 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:30:33.075 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:30:33.075 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with 
pid 1085301' 00:30:33.075 killing process with pid 1085301 00:30:33.075 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1085301 00:30:33.076 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1085301 00:30:33.336 13:39:13 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT 00:30:33.336 00:30:33.336 real 0m3.955s 00:30:33.336 user 0m10.723s 00:30:33.336 sys 0m0.383s 00:30:33.336 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable 00:30:33.336 13:39:13 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:33.336 ************************************ 00:30:33.336 END TEST bdev_bounds 00:30:33.336 ************************************ 00:30:33.336 13:39:13 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:30:33.336 13:39:13 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']' 00:30:33.336 13:39:13 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:33.336 13:39:13 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:33.336 ************************************ 00:30:33.336 START TEST bdev_nbd 00:30:33.336 ************************************ 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]] 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@301 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]] 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1086181 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1086181 /var/tmp/spdk-nbd.sock 00:30:33.336 13:39:14 
blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1086181 ']' 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:30:33.336 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable 00:30:33.336 13:39:14 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:33.336 [2024-07-25 13:39:14.081225] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:30:33.336 [2024-07-25 13:39:14.081277] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:33.595 [2024-07-25 13:39:14.171464] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:33.595 [2024-07-25 13:39:14.240088] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:30:33.595 [2024-07-25 13:39:14.261144] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:33.595 [2024-07-25 13:39:14.269168] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:33.595 [2024-07-25 13:39:14.277184] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:33.595 [2024-07-25 13:39:14.365574] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:36.134 [2024-07-25 13:39:16.535073] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:36.134 [2024-07-25 13:39:16.535118] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:36.134 [2024-07-25 13:39:16.535126] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.134 [2024-07-25 13:39:16.543091] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:36.134 [2024-07-25 13:39:16.543104] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:36.134 [2024-07-25 13:39:16.543110] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.134 [2024-07-25 13:39:16.551110] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_aesni_cbc_3" 00:30:36.135 [2024-07-25 13:39:16.551122] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:36.135 [2024-07-25 13:39:16.551128] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.135 [2024-07-25 13:39:16.559130] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:36.135 [2024-07-25 13:39:16.559142] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:36.135 [2024-07-25 13:39:16.559147] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@864 -- # return 0 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # 
local bdev_list 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:36.135 1+0 records in 00:30:36.135 1+0 
records out 00:30:36.135 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278868 s, 14.7 MB/s 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:36.135 13:39:16 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:30:36.394 13:39:17 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:36.394 1+0 records in 00:30:36.394 1+0 records out 00:30:36.394 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000280779 s, 14.6 MB/s 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:36.394 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:30:36.654 13:39:17 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:36.654 1+0 records in 00:30:36.654 1+0 records out 00:30:36.654 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000291648 s, 14.0 MB/s 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 
00:30:36.654 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:30:36.914 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:36.915 1+0 records in 00:30:36.915 1+0 records out 00:30:36.915 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000316378 s, 12.9 MB/s 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:30:36.915 13:39:17 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:36.915 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:30:37.175 { 00:30:37.175 "nbd_device": "/dev/nbd0", 00:30:37.175 "bdev_name": "crypto_ram" 00:30:37.175 }, 00:30:37.175 { 00:30:37.175 "nbd_device": "/dev/nbd1", 00:30:37.175 "bdev_name": "crypto_ram2" 00:30:37.175 }, 00:30:37.175 { 00:30:37.175 "nbd_device": "/dev/nbd2", 00:30:37.175 "bdev_name": "crypto_ram3" 00:30:37.175 }, 00:30:37.175 { 00:30:37.175 "nbd_device": "/dev/nbd3", 00:30:37.175 "bdev_name": "crypto_ram4" 00:30:37.175 } 00:30:37.175 ]' 00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:30:37.175 { 00:30:37.175 "nbd_device": "/dev/nbd0", 00:30:37.175 "bdev_name": "crypto_ram" 00:30:37.175 }, 00:30:37.175 { 00:30:37.175 "nbd_device": "/dev/nbd1", 00:30:37.175 "bdev_name": "crypto_ram2" 00:30:37.175 }, 00:30:37.175 { 00:30:37.175 "nbd_device": "/dev/nbd2", 00:30:37.175 "bdev_name": "crypto_ram3" 00:30:37.175 }, 00:30:37.175 { 00:30:37.175 "nbd_device": "/dev/nbd3", 00:30:37.175 "bdev_name": "crypto_ram4" 00:30:37.175 } 00:30:37.175 ]' 
00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:37.175 13:39:17 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:37.435 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:37.435 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:37.435 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:37.435 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:37.435 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:37.435 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:37.435 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:37.435 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:37.435 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:37.436 13:39:18 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:37.696 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:37.696 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:37.696 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:37.696 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:37.696 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:37.696 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:37.696 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:37.696 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:37.696 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:37.696 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:37.956 13:39:18 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:37.956 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:38.217 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:38.217 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:38.217 13:39:18 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:38.217 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # 
nbd_disks_name= 00:30:38.217 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:30:38.480 /dev/nbd0 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 
00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:38.480 1+0 records in 00:30:38.480 1+0 records out 00:30:38.480 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000190487 s, 21.5 MB/s 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:38.480 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:38.481 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:30:38.843 /dev/nbd1 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 
-- # (( i <= 20 )) 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:38.843 1+0 records in 00:30:38.843 1+0 records out 00:30:38.843 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000286946 s, 14.3 MB/s 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:38.843 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:30:39.105 /dev/nbd10 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:30:39.105 13:39:19 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:39.105 1+0 records in 00:30:39.105 1+0 records out 00:30:39.105 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000336837 s, 12.2 MB/s 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ 
)) 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:39.105 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:30:39.366 /dev/nbd11 00:30:39.366 13:39:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:39.366 1+0 records in 00:30:39.366 1+0 records out 00:30:39.366 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000325797 s, 12.6 MB/s 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 
00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:39.366 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:30:39.627 { 00:30:39.627 "nbd_device": "/dev/nbd0", 00:30:39.627 "bdev_name": "crypto_ram" 00:30:39.627 }, 00:30:39.627 { 00:30:39.627 "nbd_device": "/dev/nbd1", 00:30:39.627 "bdev_name": "crypto_ram2" 00:30:39.627 }, 00:30:39.627 { 00:30:39.627 "nbd_device": "/dev/nbd10", 00:30:39.627 "bdev_name": "crypto_ram3" 00:30:39.627 }, 00:30:39.627 { 00:30:39.627 "nbd_device": "/dev/nbd11", 00:30:39.627 "bdev_name": "crypto_ram4" 00:30:39.627 } 00:30:39.627 ]' 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:30:39.627 { 00:30:39.627 "nbd_device": "/dev/nbd0", 00:30:39.627 "bdev_name": "crypto_ram" 00:30:39.627 }, 00:30:39.627 { 00:30:39.627 "nbd_device": "/dev/nbd1", 00:30:39.627 "bdev_name": "crypto_ram2" 00:30:39.627 }, 00:30:39.627 { 00:30:39.627 "nbd_device": "/dev/nbd10", 00:30:39.627 "bdev_name": "crypto_ram3" 00:30:39.627 }, 00:30:39.627 { 
00:30:39.627 "nbd_device": "/dev/nbd11", 00:30:39.627 "bdev_name": "crypto_ram4" 00:30:39.627 } 00:30:39.627 ]' 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:30:39.627 /dev/nbd1 00:30:39.627 /dev/nbd10 00:30:39.627 /dev/nbd11' 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:30:39.627 /dev/nbd1 00:30:39.627 /dev/nbd10 00:30:39.627 /dev/nbd11' 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 
count=256 00:30:39.627 256+0 records in 00:30:39.627 256+0 records out 00:30:39.627 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0117224 s, 89.5 MB/s 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:30:39.627 256+0 records in 00:30:39.627 256+0 records out 00:30:39.627 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.053809 s, 19.5 MB/s 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:39.627 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:30:39.888 256+0 records in 00:30:39.888 256+0 records out 00:30:39.888 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.056932 s, 18.4 MB/s 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:30:39.888 256+0 records in 00:30:39.888 256+0 records out 00:30:39.888 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0347854 s, 30.1 MB/s 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:30:39.888 256+0 records in 00:30:39.888 256+0 records out 00:30:39.888 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0342125 s, 30.6 MB/s 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:30:39.888 13:39:20 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:39.888 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:40.148 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:40.148 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:40.148 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:40.148 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:40.148 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:40.148 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:40.148 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:40.148 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:40.148 13:39:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:40.148 13:39:20 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:40.441 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:40.441 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:40.441 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:40.441 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:40.441 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:40.441 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:40.441 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:40.441 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:40.441 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:40.441 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:30:40.701 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:30:40.961 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:30:40.961 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11'
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list
00:30:40.962 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret
00:30:41.222 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512
00:30:41.222 malloc_lvol_verify
00:30:41.222 13:39:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs
00:30:41.482 fceee43b-7c11-41fb-8695-317ace7d05d3
00:30:41.482 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs
00:30:41.742 2ace4c56-2913-4a45-91fa-995dd0487a39
00:30:41.742 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0
00:30:41.742 /dev/nbd0
00:30:41.742 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0
00:30:41.742 mke2fs 1.46.5 (30-Dec-2021)
00:30:41.742 Discarding device blocks: 0/4096 done
00:30:41.742 Creating filesystem with 4096 1k blocks and 1024 inodes
00:30:41.742 
00:30:41.742 Allocating group tables: 0/1 done
00:30:41.742 Writing inode tables: 0/1 done
00:30:41.742 Creating journal (1024 blocks): done
00:30:41.742 Writing superblocks and filesystem accounting information: 0/1 done
00:30:41.742 
00:30:41.742 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0
00:30:41.742 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0
00:30:41.742 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:30:41.742 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:30:41.742 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:30:41.742 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:30:41.742 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:30:41.742 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']'
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1086181
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1086181 ']'
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1086181
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # uname
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1086181
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1086181'
killing process with pid 1086181
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1086181
00:30:42.002 13:39:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1086181
00:30:42.262 13:39:23 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT
00:30:42.262 
00:30:42.262 real 0m8.993s
00:30:42.262 user 0m12.476s
00:30:42.262 sys 0m2.565s
00:30:42.262 13:39:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable
00:30:42.262 13:39:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:30:42.262 ************************************
00:30:42.262 END TEST bdev_nbd
00:30:42.262 ************************************
00:30:42.262 13:39:23 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # [[ y == y ]]
00:30:42.262 13:39:23 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = nvme ']'
00:30:42.262 13:39:23 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # '[' crypto_aesni = gpt ']'
00:30:42.262 13:39:23 blockdev_crypto_aesni -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite ''
00:30:42.262 13:39:23 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']'
00:30:42.262 13:39:23 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable
00:30:42.262 13:39:23 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:30:42.523 ************************************
00:30:42.523 START TEST bdev_fio
00:30:42.523 ************************************
00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite ''
00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context
00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
00:30:42.523 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev
/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram4]' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram4 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- 
bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:42.523 ************************************ 00:30:42.523 START TEST bdev_fio_rw_verify 00:30:42.523 ************************************ 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 
00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev
00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan
00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}'
00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib=
00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]]
00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev'
00:30:42.523 13:39:23 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:30:43.092 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:30:43.092 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:30:43.092 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:30:43.092 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8
00:30:43.092 fio-3.35
00:30:43.092 Starting 4 threads
00:30:57.999 
00:30:57.999 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1088640: Thu Jul 25 13:39:36 2024
00:30:57.999 read: IOPS=22.7k, BW=88.6MiB/s (92.9MB/s)(886MiB/10001msec)
00:30:57.999 slat (usec): min=14, max=816, avg=53.58, stdev=48.38
00:30:57.999 clat (usec): min=8, max=2742, avg=300.63, stdev=289.29
00:30:57.999 lat (usec): min=25, max=2868, avg=354.21, stdev=327.41
00:30:57.999 clat percentiles (usec):
00:30:57.999 | 50.000th=[ 192], 99.000th=[ 1450], 99.900th=[ 1811], 99.990th=[ 2278],
00:30:57.999 | 99.999th=[ 2638]
00:30:57.999 write: IOPS=24.9k, BW=97.3MiB/s (102MB/s)(951MiB/9777msec); 0 zone resets
00:30:57.999 slat (usec): min=15, max=509, avg=70.66, stdev=55.94
00:30:57.999 clat (usec): min=34, max=3266, avg=411.09, stdev=389.98
00:30:57.999 lat (usec): min=61, max=3605, avg=481.76, stdev=436.40
00:30:57.999 clat percentiles (usec):
00:30:57.999 | 50.000th=[ 265], 99.000th=[ 1909], 99.900th=[ 2278], 99.990th=[ 2802],
00:30:57.999 | 99.999th=[ 3195]
00:30:57.999 bw ( KiB/s): min=79072, max=112136, per=97.50%, avg=97138.21, stdev=3208.61, samples=76
00:30:57.999 iops : min=19768, max=28034, avg=24284.42, stdev=802.15, samples=76
00:30:57.999 lat (usec) : 10=0.01%, 50=2.74%, 100=12.42%, 250=38.98%, 500=22.91%
00:30:57.999 lat (usec) : 750=12.33%, 1000=4.91%
00:30:57.999 lat (msec) : 2=5.44%, 4=0.27%
00:30:57.999 cpu : usr=99.56%, sys=0.00%, ctx=62, majf=0, minf=256
00:30:57.999 IO depths : 1=10.6%, 2=23.7%, 4=52.5%, 8=13.2%, 16=0.0%, 32=0.0%, >=64=0.0%
00:30:57.999 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:30:57.999 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0%
00:30:57.999 issued rwts: total=226898,243519,0,0 short=0,0,0,0 dropped=0,0,0,0
00:30:57.999 latency : target=0, window=0, percentile=100.00%, depth=8
00:30:57.999 
00:30:57.999 Run status group 0 (all jobs):
00:30:57.999 READ: bw=88.6MiB/s (92.9MB/s), 88.6MiB/s-88.6MiB/s (92.9MB/s-92.9MB/s), io=886MiB (929MB), run=10001-10001msec
00:30:57.999 WRITE: bw=97.3MiB/s (102MB/s), 97.3MiB/s-97.3MiB/s (102MB/s-102MB/s), io=951MiB (997MB), run=9777-9777msec
00:30:57.999 
00:30:57.999 real 0m13.498s
00:30:57.999 user 0m49.603s
00:30:57.999 sys 0m0.499s
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x
00:30:57.999 ************************************
00:30:57.999 END TEST bdev_fio_rw_verify
00:30:57.999 ************************************
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' ''
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context=
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']'
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']'
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']'
00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio
-- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f00dd951-b358-54b6-abb5-6e7cd2acb55e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f00dd951-b358-54b6-abb5-6e7cd2acb55e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "1abd6d37-a1d5-5ec0-93b8-aff478c73e72"' ' 
],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1abd6d37-a1d5-5ec0-93b8-aff478c73e72",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "06bb7e5b-d3b8-5654-abed-87db269e11c6"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "06bb7e5b-d3b8-5654-abed-87db269e11c6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' 
"dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "fb2bdebb-e0d6-536c-84c0-cb66febb476c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "fb2bdebb-e0d6-536c-84c0-cb66febb476c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:30:57.999 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:30:57.999 crypto_ram2 00:30:57.999 crypto_ram3 00:30:57.999 crypto_ram4 ]] 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "f00dd951-b358-54b6-abb5-6e7cd2acb55e"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "f00dd951-b358-54b6-abb5-6e7cd2acb55e",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "1abd6d37-a1d5-5ec0-93b8-aff478c73e72"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "1abd6d37-a1d5-5ec0-93b8-aff478c73e72",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": 
"crypto_ram3",' ' "aliases": [' ' "06bb7e5b-d3b8-5654-abed-87db269e11c6"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "06bb7e5b-d3b8-5654-abed-87db269e11c6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "fb2bdebb-e0d6-536c-84c0-cb66febb476c"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "fb2bdebb-e0d6-536c-84c0-cb66febb476c",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram4]' 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram4 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k 
--runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:58.000 ************************************ 00:30:58.000 START TEST bdev_fio_trim 00:30:58.000 ************************************ 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:58.000 13:39:36 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # 
asan_lib= 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:58.000 13:39:36 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:58.000 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:58.000 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:58.000 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:58.000 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:58.000 fio-3.35 00:30:58.000 Starting 4 threads 00:31:10.257 00:31:10.257 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1090943: Thu Jul 25 13:39:50 2024 00:31:10.257 write: IOPS=58.6k, BW=229MiB/s (240MB/s)(2290MiB/10001msec); 0 zone resets 00:31:10.257 slat (usec): min=14, max=501, avg=41.14, stdev=25.46 00:31:10.257 clat (usec): min=17, max=1321, avg=194.90, stdev=129.75 00:31:10.257 lat (usec): min=45, max=1550, avg=236.04, stdev=146.68 00:31:10.257 clat percentiles (usec): 00:31:10.257 | 50.000th=[ 153], 99.000th=[ 545], 99.900th=[ 660], 99.990th=[ 938], 00:31:10.257 | 99.999th=[ 1270] 00:31:10.257 bw ( KiB/s): min=228450, max=260648, per=100.00%, avg=234518.00, stdev=1780.19, 
samples=76 00:31:10.257 iops : min=57112, max=65162, avg=58629.47, stdev=445.06, samples=76 00:31:10.257 trim: IOPS=58.6k, BW=229MiB/s (240MB/s)(2290MiB/10001msec); 0 zone resets 00:31:10.257 slat (nsec): min=4840, max=51889, avg=8328.79, stdev=4217.34 00:31:10.257 clat (usec): min=29, max=996, avg=167.98, stdev=70.44 00:31:10.257 lat (usec): min=34, max=1002, avg=176.31, stdev=71.17 00:31:10.257 clat percentiles (usec): 00:31:10.257 | 50.000th=[ 157], 99.000th=[ 367], 99.900th=[ 457], 99.990th=[ 619], 00:31:10.257 | 99.999th=[ 840] 00:31:10.257 bw ( KiB/s): min=228450, max=260648, per=100.00%, avg=234519.68, stdev=1780.03, samples=76 00:31:10.257 iops : min=57112, max=65162, avg=58629.89, stdev=445.02, samples=76 00:31:10.257 lat (usec) : 20=0.01%, 50=2.80%, 100=18.91%, 250=58.09%, 500=19.04% 00:31:10.257 lat (usec) : 750=1.14%, 1000=0.01% 00:31:10.257 lat (msec) : 2=0.01% 00:31:10.257 cpu : usr=99.69%, sys=0.00%, ctx=50, majf=0, minf=89 00:31:10.257 IO depths : 1=8.1%, 2=22.1%, 4=55.8%, 8=14.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:10.257 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:10.257 complete : 0=0.0%, 4=87.7%, 8=12.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:10.257 issued rwts: total=0,586219,586220,0 short=0,0,0,0 dropped=0,0,0,0 00:31:10.257 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:10.257 00:31:10.257 Run status group 0 (all jobs): 00:31:10.257 WRITE: bw=229MiB/s (240MB/s), 229MiB/s-229MiB/s (240MB/s-240MB/s), io=2290MiB (2401MB), run=10001-10001msec 00:31:10.257 TRIM: bw=229MiB/s (240MB/s), 229MiB/s-229MiB/s (240MB/s-240MB/s), io=2290MiB (2401MB), run=10001-10001msec 00:31:10.257 00:31:10.257 real 0m13.432s 00:31:10.257 user 0m49.703s 00:31:10.257 sys 0m0.488s 00:31:10.257 13:39:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:10.257 13:39:50 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 
00:31:10.257 ************************************ 00:31:10.257 END TEST bdev_fio_trim 00:31:10.257 ************************************ 00:31:10.258 13:39:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:31:10.258 13:39:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:10.258 13:39:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:31:10.258 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:10.258 13:39:50 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:31:10.258 00:31:10.258 real 0m27.282s 00:31:10.258 user 1m39.470s 00:31:10.258 sys 0m1.192s 00:31:10.258 13:39:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:10.258 13:39:50 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:10.258 ************************************ 00:31:10.258 END TEST bdev_fio 00:31:10.258 ************************************ 00:31:10.258 13:39:50 blockdev_crypto_aesni -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:10.258 13:39:50 blockdev_crypto_aesni -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:10.258 13:39:50 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:31:10.258 13:39:50 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:10.258 13:39:50 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:10.258 ************************************ 00:31:10.258 START TEST bdev_verify 00:31:10.258 ************************************ 00:31:10.258 13:39:50 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1125 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:10.258 [2024-07-25 13:39:50.505884] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:31:10.258 [2024-07-25 13:39:50.505944] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1092687 ] 00:31:10.258 [2024-07-25 13:39:50.599043] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:10.258 [2024-07-25 13:39:50.694533] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:10.258 [2024-07-25 13:39:50.694537] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:10.258 [2024-07-25 13:39:50.715848] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:10.258 [2024-07-25 13:39:50.723875] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:10.258 [2024-07-25 13:39:50.731899] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:10.258 [2024-07-25 13:39:50.834217] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:12.802 [2024-07-25 13:39:53.108770] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:12.802 [2024-07-25 13:39:53.108836] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:12.802 [2024-07-25 13:39:53.108845] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:12.802 [2024-07-25 13:39:53.116784] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: 
*NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:12.802 [2024-07-25 13:39:53.116796] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:12.802 [2024-07-25 13:39:53.116802] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:12.802 [2024-07-25 13:39:53.124804] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:12.802 [2024-07-25 13:39:53.124815] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:12.802 [2024-07-25 13:39:53.124821] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:12.802 [2024-07-25 13:39:53.132825] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:12.802 [2024-07-25 13:39:53.132835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:12.802 [2024-07-25 13:39:53.132841] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:12.802 Running I/O for 5 seconds... 
00:31:18.093 00:31:18.093 Latency(us) 00:31:18.093 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:18.093 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:18.093 Verification LBA range: start 0x0 length 0x1000 00:31:18.093 crypto_ram : 5.06 597.64 2.33 0.00 0.00 213330.61 2495.41 128248.91 00:31:18.093 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:18.093 Verification LBA range: start 0x1000 length 0x1000 00:31:18.093 crypto_ram : 5.06 404.76 1.58 0.00 0.00 314545.23 13913.80 177451.32 00:31:18.093 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:18.093 Verification LBA range: start 0x0 length 0x1000 00:31:18.093 crypto_ram2 : 5.06 600.62 2.35 0.00 0.00 211954.70 3037.34 118569.75 00:31:18.093 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:18.093 Verification LBA range: start 0x1000 length 0x1000 00:31:18.093 crypto_ram2 : 5.06 404.66 1.58 0.00 0.00 313472.98 14417.92 172611.74 00:31:18.093 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:18.093 Verification LBA range: start 0x0 length 0x1000 00:31:18.093 crypto_ram3 : 5.04 4672.15 18.25 0.00 0.00 27165.01 5620.97 22887.19 00:31:18.093 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:18.093 Verification LBA range: start 0x1000 length 0x1000 00:31:18.093 crypto_ram3 : 5.05 3165.47 12.37 0.00 0.00 39937.10 2318.97 29642.44 00:31:18.093 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:18.093 Verification LBA range: start 0x0 length 0x1000 00:31:18.093 crypto_ram4 : 5.05 4687.41 18.31 0.00 0.00 27051.05 2621.44 22282.24 00:31:18.093 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:18.093 Verification LBA range: start 0x1000 length 0x1000 00:31:18.093 crypto_ram4 : 5.04 3143.52 12.28 0.00 0.00 40526.40 8822.15 
56461.78 00:31:18.093 =================================================================================================================== 00:31:18.093 Total : 17676.23 69.05 0.00 0.00 57563.14 2318.97 177451.32 00:31:18.093 00:31:18.093 real 0m8.085s 00:31:18.093 user 0m15.433s 00:31:18.093 sys 0m0.348s 00:31:18.093 13:39:58 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:18.093 13:39:58 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:31:18.093 ************************************ 00:31:18.093 END TEST bdev_verify 00:31:18.093 ************************************ 00:31:18.093 13:39:58 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:18.093 13:39:58 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:31:18.093 13:39:58 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:18.094 13:39:58 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:18.094 ************************************ 00:31:18.094 START TEST bdev_verify_big_io 00:31:18.094 ************************************ 00:31:18.094 13:39:58 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:18.094 [2024-07-25 13:39:58.666039] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:31:18.094 [2024-07-25 13:39:58.666084] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1094029 ] 00:31:18.094 [2024-07-25 13:39:58.754581] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:18.094 [2024-07-25 13:39:58.829105] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:18.094 [2024-07-25 13:39:58.829110] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:18.094 [2024-07-25 13:39:58.850245] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:31:18.094 [2024-07-25 13:39:58.858271] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:31:18.094 [2024-07-25 13:39:58.866294] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:31:18.354 [2024-07-25 13:39:58.951157] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:31:20.901 [2024-07-25 13:40:01.115678] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:31:20.901 [2024-07-25 13:40:01.115736] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:20.901 [2024-07-25 13:40:01.115744] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:20.901 [2024-07-25 13:40:01.123694] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:31:20.901 [2024-07-25 13:40:01.123706] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:20.901 [2024-07-25 13:40:01.123717] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:31:20.901 [2024-07-25 13:40:01.131716] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:31:20.901 [2024-07-25 13:40:01.131728] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:31:20.901 [2024-07-25 13:40:01.131734] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:20.901 [2024-07-25 13:40:01.139737] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:31:20.901 [2024-07-25 13:40:01.139749] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:31:20.901 [2024-07-25 13:40:01.139755] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:20.901 Running I/O for 5 seconds... 00:31:21.477 [2024-07-25 13:40:02.227115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.477 [2024-07-25 13:40:02.227336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.477 [2024-07-25 13:40:02.227392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.477 [2024-07-25 13:40:02.227449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.477 [2024-07-25 13:40:02.227854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.477 [2024-07-25 13:40:02.229111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.477 [2024-07-25 13:40:02.229168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.477 [2024-07-25 13:40:02.229213 - 13:40:02.260233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! [last message repeated for each queued task in this interval]
00:31:21.478 [2024-07-25 13:40:02.260285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.260329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.260840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.260889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.260934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.260979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.261300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.262407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.262458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.262502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.262552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.262965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.263013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.478 [2024-07-25 13:40:02.263057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.263102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.263494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.265131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.478 [2024-07-25 13:40:02.265188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.479 [2024-07-25 13:40:02.265232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.479 [2024-07-25 13:40:02.265276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.479 [2024-07-25 13:40:02.265717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.479 [2024-07-25 13:40:02.265771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.479 [2024-07-25 13:40:02.265820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.479 [2024-07-25 13:40:02.265864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.479 [2024-07-25 13:40:02.266210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.267350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.743 [2024-07-25 13:40:02.267402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.267447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.267493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.267958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.268006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.268053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.268097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.268595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.269819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.269872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.269916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.269960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.270455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.743 [2024-07-25 13:40:02.270503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.270554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.270600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.270920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.272020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.272073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.272117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.272164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.272834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.272885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.272931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.272977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.273334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.743 [2024-07-25 13:40:02.274533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.274601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.274648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.274694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.275138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.275190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.275235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.275279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.275611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.276741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.276793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.276838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.276883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.743 [2024-07-25 13:40:02.277330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.277378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.277423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.277466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.277863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.278946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.278997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.279041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.279086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.279500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.279553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.279598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.279642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.743 [2024-07-25 13:40:02.280074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.743 [2024-07-25 13:40:02.281258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.281310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.281354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.281398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.281898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.281952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.281997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.282041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.282440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.283615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.283671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.283716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.744 [2024-07-25 13:40:02.283760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.284304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.284357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.284402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.284461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.284979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.286129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.286190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.286234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.286278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.286758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.286807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.286852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.744 [2024-07-25 13:40:02.286896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.287264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.288385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.288438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.288483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.288533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.289246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.289296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.289342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.289388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.289723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.290839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.290890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.744 [2024-07-25 13:40:02.290935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.290979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.291390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.291438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.291482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.291526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.291951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.293643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.293700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.293746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.293791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.294224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.294276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.744 [2024-07-25 13:40:02.294320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.294365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.294756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.295859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.295911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.295955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.295999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.296466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.296514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.296564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.296610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.297121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.298250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.744 [2024-07-25 13:40:02.298301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.298350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.298394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.298850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.298899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.298944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.298988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.299308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.300384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.300436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.300481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.744 [2024-07-25 13:40:02.300525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.301226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.745 [2024-07-25 13:40:02.301278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.301325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.301371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.301709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.302846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.304589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.306544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.308605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.309151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.309960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.311921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.313837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:21.745 [2024-07-25 13:40:02.314162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:21.745 [2024-07-25 13:40:02.317175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:22.013 [... message repeated continuously through 2024-07-25 13:40:02.704154 ...]
00:31:22.013 [2024-07-25 13:40:02.706811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.707274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.707914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.710101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.710521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.711680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.712175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.712222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.712271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.712316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.712728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.714572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.714623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.013 [2024-07-25 13:40:02.714668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.714713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.715114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.716357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.716409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.716453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.716506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.717053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.717250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.717298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.717345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.717391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.717938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.013 [2024-07-25 13:40:02.719230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.719283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.719327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.719372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.719922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.720105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.720153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.720198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.720245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.720711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.722342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.013 [2024-07-25 13:40:02.722395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.722439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.014 [2024-07-25 13:40:02.722489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.722979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.723109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.723155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.723201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.723245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.723778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.725353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.725406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.725451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.725495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.726009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.726189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.014 [2024-07-25 13:40:02.726242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.726287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.726335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.726781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.728162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.728216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.728261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.728312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.728802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.728983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.729043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.729090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.729136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.014 [2024-07-25 13:40:02.729591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.730909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.730961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.731007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.731056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.731620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.731801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.731849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.731894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.731939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.732410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.733853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.733905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.014 [2024-07-25 13:40:02.733956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.734001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.734475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.734661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.734710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.734757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.734805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.735220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.736826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.736879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.736926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.736973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.737472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.014 [2024-07-25 13:40:02.737607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.737665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.737709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.737759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.738308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.739896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.739960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.740005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.740050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.740587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.740769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.740816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.740864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.014 [2024-07-25 13:40:02.740910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.741370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.742908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.742966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.743013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.743060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.743485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.743618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.743665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.743709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.743754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.744257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.745616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.014 [2024-07-25 13:40:02.745667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.745724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.745770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.014 [2024-07-25 13:40:02.746270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.746449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.746497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.746542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.746603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.747036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.748303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.748357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.748404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.748449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.015 [2024-07-25 13:40:02.748988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.749168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.749219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.749265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.749313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.749746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.751020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.751084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.751129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.751175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.751642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.751829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.751880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.015 [2024-07-25 13:40:02.751926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.751974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.752466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.753766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.753818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.753863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.753908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.754398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.754585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.754640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.754687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.754734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.755165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.015 [2024-07-25 13:40:02.756589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.756642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.756692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.756737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.757296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.757492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.757541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.757592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.757639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.758065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.759645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.759698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.015 [2024-07-25 13:40:02.759743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.015 [2024-07-25 13:40:02.759792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [previous message repeated continuously through 2024-07-25 13:40:02.973850]
00:31:22.281 [2024-07-25 13:40:02.976925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.978160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.978624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.980333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.980726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.983097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.984550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.986258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.988484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.988915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.991954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.994170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.995667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.281 [2024-07-25 13:40:02.997400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.997729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.998756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:02.999220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.001201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.003326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.003654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.006781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.007371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.008020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.009900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.010276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.012216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.281 [2024-07-25 13:40:03.013973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.015885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.018129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.018667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.021861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.023903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.025619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.027483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.027811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.028730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.029194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.031369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.033420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.281 [2024-07-25 13:40:03.033861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.035329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.035803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.037228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.038732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.039223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.039770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.041658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.042997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.044863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.045250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.047410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.047883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.281 [2024-07-25 13:40:03.048349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.048813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.049333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.049879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.050342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.050818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.051278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.051716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.053724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.054195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.281 [2024-07-25 13:40:03.054662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.055123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.055601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.282 [2024-07-25 13:40:03.056139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.056606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.057067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.057526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.058060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.059660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.060130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.060597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.061059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.061556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.062094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.062562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.063024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.282 [2024-07-25 13:40:03.063482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.064048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.065802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.066270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.066734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.067200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.067676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.068218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.282 [2024-07-25 13:40:03.068689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.069151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.069625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.070145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.072201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.547 [2024-07-25 13:40:03.072690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.073153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.073616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.074098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.074639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.075101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.075567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.076026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.076553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.078148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.078621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.079083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.079543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.547 [2024-07-25 13:40:03.079947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.080485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.080951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.081414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.081877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.082337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.084108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.084582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.085046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.085507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.086022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.086566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.087040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.547 [2024-07-25 13:40:03.087502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.087965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.088502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.090328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.092173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.094576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.095597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.095919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.097706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.098168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.099591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.101467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.101935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.547 [2024-07-25 13:40:03.103614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.105335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.107120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.107585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.107943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.110082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.111536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.113495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.115916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.116418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.119608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.121665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.123373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.547 [2024-07-25 13:40:03.125543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.125872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.126408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.547 [2024-07-25 13:40:03.127173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.129258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.131078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.131431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.134670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.135138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.136113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.138095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.138490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.140093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.548 [2024-07-25 13:40:03.141822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.143786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.145948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.146467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.149706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.151666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.153403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.155345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.155673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.156316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.157004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.158925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.161078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.548 [2024-07-25 13:40:03.161427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.164736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.165204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.166221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.168399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.168728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.170249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.172166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.174123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.175849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.176346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.179477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.548 [2024-07-25 13:40:03.181079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.551 [2024-07-25 13:40:03.272500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.272553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.272893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.273000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.273050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.273096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.273140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.273538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.274571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.274624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.274674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.274718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.275147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.551 [2024-07-25 13:40:03.275254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.275301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.275346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.275391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.275732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.276755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.276811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.276855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.276900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.277218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.277325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.277370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.277415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.551 [2024-07-25 13:40:03.277460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.277833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.279031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.279083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.551 [2024-07-25 13:40:03.279127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.279172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.279504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.279617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.279664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.279709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.279753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.280115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.281219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.552 [2024-07-25 13:40:03.281271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.281317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.281366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.281713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.281819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.281870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.281914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.281959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.282279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.283856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.283912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.283958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.284005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.552 [2024-07-25 13:40:03.284397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.284523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.284574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.284619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.284662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.284981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.286133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.286186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.286231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.286276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.286601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.286709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.286754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.552 [2024-07-25 13:40:03.286801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.286846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.287330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.288839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.288892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.288936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.288982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.289317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.289424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.289471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.289516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.289567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.290026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.552 [2024-07-25 13:40:03.291128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.291180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.291226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.292380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.292855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.292991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.294723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.296761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.298183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.298534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.299802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.300270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.300738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.552 [2024-07-25 13:40:03.301200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.301666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.303789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.305812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.307025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.309131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.309525] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.311102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.552 [2024-07-25 13:40:03.311578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.312041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.312502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.312961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.313498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.553 [2024-07-25 13:40:03.313964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.314425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.314890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.315289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.317171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.317652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.318114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.318578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.319050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.319590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.320053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.320514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.320980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.553 [2024-07-25 13:40:03.321442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.323249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.323728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.324190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.324659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.325139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.325679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.326153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.326618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.327077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.327593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.329732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.330202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.553 [2024-07-25 13:40:03.330669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.331129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.331604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.332150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.332949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.553 [2024-07-25 13:40:03.334177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.336003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.336491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.338915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.339626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.340088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.341846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.342356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.815 [2024-07-25 13:40:03.343610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.344085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.345522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.346413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.346839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.349346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.351175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.351641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.352310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.352694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.354505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.354972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.355430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.815 [2024-07-25 13:40:03.357206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.357553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.360738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.361782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.362755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.363222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.363723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.364665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.366186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.366651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.367469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.367881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:22.815 [2024-07-25 13:40:03.369395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:22.815 [2024-07-25 13:40:03.371185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [... previous message repeated for each subsequent allocation attempt, last at 2024-07-25 13:40:03.724741 ...]
00:31:23.082 [2024-07-25 13:40:03.725269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.725451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.725502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.725555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.725602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.726157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.727659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.727712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.727757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.727805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.728256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.728408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.728454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.082 [2024-07-25 13:40:03.728499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.728560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.729059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.730667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.730727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.730773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.730817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.731323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.731506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.731559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.731607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.731653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.732136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.082 [2024-07-25 13:40:03.733644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.733704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.733751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.733797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.734289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.734432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.734478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.082 [2024-07-25 13:40:03.734523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.734574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.735049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.736451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.736503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.736566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.083 [2024-07-25 13:40:03.736612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.737088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.737270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.737321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.737367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.737414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.737878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.739226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.739281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.739327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.739372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.739912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.740094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.083 [2024-07-25 13:40:03.740141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.740187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.740233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.740684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.742046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.742103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.742160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.742205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.742709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.742891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.742939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.742984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.743030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.083 [2024-07-25 13:40:03.743524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.744888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.744940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.744985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.745029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.745567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.745752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.745800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.745846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.745892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.746350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.747801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.747856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.083 [2024-07-25 13:40:03.747901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.747954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.748463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.748661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.748713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.748759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.748805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.749167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.750493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.750545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.750610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.750655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.751124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.083 [2024-07-25 13:40:03.751305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.751353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.751398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.751445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.751908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.753261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.753316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.753361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.753406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.753888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.754072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.754124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.754170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.083 [2024-07-25 13:40:03.754216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.754637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.756002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.756055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.756101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.756149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.756665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.756855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.756902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.756948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.756994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.757358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.083 [2024-07-25 13:40:03.758692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.084 [2024-07-25 13:40:03.758745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.758790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.758844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.759292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.759471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.759522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.759586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.759644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.760155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.761247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.761299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.761343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.761389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.084 [2024-07-25 13:40:03.761736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.761844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.761892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.761937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.761981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.762391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.763519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.763577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.763622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.763667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.764147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.764330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.764377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.084 [2024-07-25 13:40:03.764423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.764468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.764928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.766154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.766205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.766249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.766301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.766737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.766844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.766889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.766933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.766977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.767438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.084 [2024-07-25 13:40:03.768758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.768812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.768871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.768915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.769254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.769359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.769408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.769453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.769497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.770015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.771294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.771358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.084 [2024-07-25 13:40:03.771403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.084 [2024-07-25 13:40:03.771447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
... (same "Failed to get src_mbufs!" error repeated) ...
00:31:23.350 [2024-07-25 13:40:04.109868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:23.350 [2024-07-25 13:40:04.112040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.113223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.114173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.116281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.116670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.119666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.121846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.123634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.125132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.127706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.129719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.131749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.133468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.350 [2024-07-25 13:40:04.133799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.135841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.350 [2024-07-25 13:40:04.137022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.139039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.140379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.142481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.144635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.146688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.148555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.149035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.151934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.154088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.155766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.613 [2024-07-25 13:40:04.157520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.159955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.160855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.162409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.162659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.165340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.167290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.169132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.170920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.171366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.173140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.173188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.174775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.613 [2024-07-25 13:40:04.175182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.178199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.180006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.181768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.183317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.183780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.185148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.185194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.613 [2024-07-25 13:40:04.187132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.187471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.188632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.190748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.192711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.614 [2024-07-25 13:40:04.192757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.193379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.194668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.194714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.196461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.196849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.199803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.199857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.201949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.201995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.202490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.203758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.203805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.614 [2024-07-25 13:40:04.204507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.204858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.207782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.207837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.210019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.210065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.210080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.210398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.210536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.212018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.212066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.213890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.214267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.614 [2024-07-25 13:40:04.215389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.217160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.217208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.217253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.217574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.217703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.219874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.219921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.221715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.222214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.223426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.223479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.223524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.614 [2024-07-25 13:40:04.223586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.223940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.224046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.225801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.225849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.227794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.228136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.229675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.229729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.229775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.229819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.230139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.230269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.614 [2024-07-25 13:40:04.230315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.230364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.230409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.230773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.231854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.231905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.231950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.231994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.232314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.232422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.232468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.232513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.232562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.614 [2024-07-25 13:40:04.233000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.234417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.234469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.234515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.234567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.234922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.235030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.235075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.235120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.235164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.235512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.236636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.614 [2024-07-25 13:40:04.236688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.615 [2024-07-25 13:40:04.236733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.236777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.237159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.237267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.237313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.237362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.237406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.237771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.239277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.239329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.239379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.239424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.239771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.615 [2024-07-25 13:40:04.239880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.239925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.239973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.240017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.240357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.241472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.241524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.241572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.241617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.241997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.242107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.242153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.242197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.615 [2024-07-25 13:40:04.242241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.242608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.244120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.244172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.244216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.244261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.244585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.244696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.244742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.244786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.244838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.245198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.615 [2024-07-25 13:40:04.246322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.615 [2024-07-25 13:40:04.246374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:23.943 [2024-07-25 13:40:04.414350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.414677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.417656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.419037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.419661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.420125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.420492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.421578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.422583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.423051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.424560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.425043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.426894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.943 [2024-07-25 13:40:04.428147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.430013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.430474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.430904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.432779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.434478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.434947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.435413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.435740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.437494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.438995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.439858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.441055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.943 [2024-07-25 13:40:04.441496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.442633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.443668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.445317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.445779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.446305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.447745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.448212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.450001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.451583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.452068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.452632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.454447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.943 [2024-07-25 13:40:04.455728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.456481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.456919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.459724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.460190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.461246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.462309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.462637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.463564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.464710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.466468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.468277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.943 [2024-07-25 13:40:04.468637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.944 [2024-07-25 13:40:04.472017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.473911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.475644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.477822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.478146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.479790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.480690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.481846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.482309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.482799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.484860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.485809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.487565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.944 [2024-07-25 13:40:04.489342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.489799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.490972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.492726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.494955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.496052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.496417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.498351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.499620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.500080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.501925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.502387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.503520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.944 [2024-07-25 13:40:04.505288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.506771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.508517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.508864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.511018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.512970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.515082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.516483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.516900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.519115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.520896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.522684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.524273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.944 [2024-07-25 13:40:04.524776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.527802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.529545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.531459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.533690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.534090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.535111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.536696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.538445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.540352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.540679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.543660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.545148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.944 [2024-07-25 13:40:04.546770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.548571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.548995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.551196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.553176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.554677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.556639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.556987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.559259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.560022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.562187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.564107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.564435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.944 [2024-07-25 13:40:04.566511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.568444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.570624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.572405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.572854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.576009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.578169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.579795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.581543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.581872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.583517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.584447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.586028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.944 [2024-07-25 13:40:04.587756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.588080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.591001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.592887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.944 [2024-07-25 13:40:04.594537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.596038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.596362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.597627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.599655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.601781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.603168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.603516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.606284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.945 [2024-07-25 13:40:04.607734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.608489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.610448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.610817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.612692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.614664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.616517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.618635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.619101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.622097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.623914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.626130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.627518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.945 [2024-07-25 13:40:04.627862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.629817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.631368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.633049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.634831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.635381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.638381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.640130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.642120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.644325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.644730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.645686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:23.945 [2024-07-25 13:40:04.647207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:23.945 [2024-07-25 13:40:04.648977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [identical *ERROR* message repeated ~270 more times between 13:40:04.648977 and 13:40:04.837209; repeats omitted]
00:31:24.226 [2024-07-25 13:40:04.838739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.838796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.838842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.838887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.839208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.839316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.839361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.839405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.839450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.839775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.841198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.841250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.841295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.226 [2024-07-25 13:40:04.841340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.841664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.841777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.841822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.841868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.841916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.842237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.843478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.843533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.843587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.843632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.843973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.844078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.226 [2024-07-25 13:40:04.844125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.844169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.226 [2024-07-25 13:40:04.844223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.844544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.845732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.845783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.845828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.845872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.846356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.846536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.846588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.846634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.846680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.227 [2024-07-25 13:40:04.847155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.848461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.848512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.848561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.848606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.849120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.849299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.849350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.849397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.849441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.850014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.851372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.851423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.227 [2024-07-25 13:40:04.851468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.851512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.852029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.852208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.852256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.852301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.852350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.852880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.854262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.854321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.854366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.854422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.854927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.227 [2024-07-25 13:40:04.855116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.855164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.855209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.855256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.855722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.857352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.857404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.857448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.857493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.858002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.858182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.858234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.858280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.227 [2024-07-25 13:40:04.858326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.858880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.860213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.860266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.860311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.860368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.860918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.861100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.861164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.861211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.861256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.861690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.863371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.227 [2024-07-25 13:40:04.863424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.863469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.863514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.863978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.864156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.864203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.864249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.864301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.864846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.866167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.866224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.866269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.866324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.227 [2024-07-25 13:40:04.866861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.867057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.867519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.867587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.868046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.868490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.227 [2024-07-25 13:40:04.870042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.870096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.870144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.870190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.870646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.870773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.871234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.228 [2024-07-25 13:40:04.871280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.871743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.872183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.873424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.873488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.873953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.874000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.874542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.874680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.875153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.875203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.875664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.876116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.228 [2024-07-25 13:40:04.877555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.878021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.878068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.878526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.878996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.879149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.879613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.879660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.880124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.880596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.881667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.883596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.883644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.228 [2024-07-25 13:40:04.885796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.886200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.886339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.888105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.888153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.890293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.890696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.891829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.893430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.894192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.895468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.895798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.895940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.228 [2024-07-25 13:40:04.897598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.897657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.898114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.898437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.899693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.900160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.902165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.903536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.903948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.904088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.905403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.905452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.228 [2024-07-25 13:40:04.905917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.228 [2024-07-25 13:40:04.906273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.494 [... previous message repeated continuously from 13:40:04.906273 through 13:40:05.256288 ...] 
00:31:24.494 [2024-07-25 13:40:05.256288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.494 [2024-07-25 13:40:05.256337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.256807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.257133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.258452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.258506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.260749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.260798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.261250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.264123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.264176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.265494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.265541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.265897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.494 [2024-07-25 13:40:05.267812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.267861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.268319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.268366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.268745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.271727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.271783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.272746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.272794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.273276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.273845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.273895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.275640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.494 [2024-07-25 13:40:05.275687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.276092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.279203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.279258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.494 [2024-07-25 13:40:05.279306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.495 [2024-07-25 13:40:05.279350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.495 [2024-07-25 13:40:05.279879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.495 [2024-07-25 13:40:05.280440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.495 [2024-07-25 13:40:05.280489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.495 [2024-07-25 13:40:05.280953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.495 [2024-07-25 13:40:05.281002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.281494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.284157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.759 [2024-07-25 13:40:05.284214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.284260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.284304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.284698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.286846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.286897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.287354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.287401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.287876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.289133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.289186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.289237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.289281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.759 [2024-07-25 13:40:05.289632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.290911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.290961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.291006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.291050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.291414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.292867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.292920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.292965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.293012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.293450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.293603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.293650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.759 [2024-07-25 13:40:05.293694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.293739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.294060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.295263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.295316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.295360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.295404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.295730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.295840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.295885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.295931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.295974] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.296505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.759 [2024-07-25 13:40:05.297918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.297970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.298014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.298059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.298381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.298490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.298536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.298585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.298631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.299038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.300148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.300199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.300244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.759 [2024-07-25 13:40:05.300288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.300748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.300932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.300980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.301037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.301087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.759 [2024-07-25 13:40:05.301523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.302623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.302679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.302724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.302768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.303168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.303278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.760 [2024-07-25 13:40:05.303323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.303368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.303413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.303738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.304864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.304917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.304963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.305007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.305558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.305741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.305788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.305834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.305882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.760 [2024-07-25 13:40:05.306201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.307352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.307404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.307448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.307493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.307822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.307934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.307984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.308029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.308073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.308391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.309966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.310018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.760 [2024-07-25 13:40:05.310063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.310109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.310480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.310627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.310674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.310718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.310766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.311087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.312236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.312292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.312336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.312380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.312707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.760 [2024-07-25 13:40:05.312815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.312860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.312904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.312949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.313444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.314621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.314676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.314721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.314765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.315086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.315192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.315245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.315289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.760 [2024-07-25 13:40:05.315333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.315658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.316647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.316701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.316746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.316790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.317290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.317469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.317515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.317565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.317611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.317931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.760 [2024-07-25 13:40:05.318953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.760 [2024-07-25 13:40:05.319006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:24.764 [previous message repeated for subsequent allocation attempts through 2024-07-25 13:40:05.517781]
00:31:24.764 [2024-07-25 13:40:05.518267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.518835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.519298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.519765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.520223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.520773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.523445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.523919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.524381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.524844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.525317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.525884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.526347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:24.764 [2024-07-25 13:40:05.526816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.527276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.527801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.533914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.536039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.537859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.539607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.539952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.542330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.542799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.543271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.543733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:24.764 [2024-07-25 13:40:05.544261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.026 [2024-07-25 13:40:05.547270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.547743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.548205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.550372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.550702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.551930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.553897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.555680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.556151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.556613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.558761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.560955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.561749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.026 [2024-07-25 13:40:05.563910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.564318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.564875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.566426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.568171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.568680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.026 [2024-07-25 13:40:05.569162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.575162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.576843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.577310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.577863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.578218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.579412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.027 [2024-07-25 13:40:05.581585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.582699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.583159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.583485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.590196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.591519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.591984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.594026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.594402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.596391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.598066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.600023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.601931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.027 [2024-07-25 13:40:05.602288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.607579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.609522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.611448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.612000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.612480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.614591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.616568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.618727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.620450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.620781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.626311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.628061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.027 [2024-07-25 13:40:05.630072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.632074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.632444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.633047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.634511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.636357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.638244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.638597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.644002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.646171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.648243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.650012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.650337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.027 [2024-07-25 13:40:05.652506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.654613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.655827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.656287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.656629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.663280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.664467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.664934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.667102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.667453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.669443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.671272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.673380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.027 [2024-07-25 13:40:05.675325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.675807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.681091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.683085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.685124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.685944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.686444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.688542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.690544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.692417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.694162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.694521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.700010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.027 [2024-07-25 13:40:05.701764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.703747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.705830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.027 [2024-07-25 13:40:05.706220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.706773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.707819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.709952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.711761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.712154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.717699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.719721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.721792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.723534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.028 [2024-07-25 13:40:05.723864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.725838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.728000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.728765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.729460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.729806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.736200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.737611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.738078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.740058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.740416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.742495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.744133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.028 [2024-07-25 13:40:05.746095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.748137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.748527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.753585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.755568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.757426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.758407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.758887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.760951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.763103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.765093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.766825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.767148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.028 [2024-07-25 13:40:05.772876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.774745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.776481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.778644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.779019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.779885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.780782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.782959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.784688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.785025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.790641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.792537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.794454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.028 [2024-07-25 13:40:05.796326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.796745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.798628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.800566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.801584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.802043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.802415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.809204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.810431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.810895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.811355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.811814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.028 [2024-07-25 13:40:05.812368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.294 [2024-07-25 13:40:05.962197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.962242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.962288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.962611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.966947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.967002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.967047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.967091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.967410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.967519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.967569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.967615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.967664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.294 [2024-07-25 13:40:05.968187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.972043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.972103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.972148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.972193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.972564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.972671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.972717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.972762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.972807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.973124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.977047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.977102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.294 [2024-07-25 13:40:05.977147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.977199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.977566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.294 [2024-07-25 13:40:05.977674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.977733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.977777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.977821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.978218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.981807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.981864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.981924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.981968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.982310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.295 [2024-07-25 13:40:05.982419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.982465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.982509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.982558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.982880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.987853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.987910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.987965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.988011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.988477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.988657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.988703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.988748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.295 [2024-07-25 13:40:05.988797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.989152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.994483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.994542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.994593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.994638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.995083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.995258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.995789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.995836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.998021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:05.998368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.003130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.295 [2024-07-25 13:40:06.003190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.003239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.003285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.003752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.003931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.005645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.005693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.007694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.008017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.012285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.012340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.012802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.012850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.295 [2024-07-25 13:40:06.013263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.013407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.015598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.015645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.017122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.017529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.021296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.023348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.023396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.025541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.025989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.026135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.028026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.295 [2024-07-25 13:40:06.028074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.029984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.030389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.033622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.035135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.035182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.036959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.037283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.037422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.038322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.038371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.038877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.039272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.295 [2024-07-25 13:40:06.044154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.046311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.046777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.047697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.048059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.295 [2024-07-25 13:40:06.048197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.050257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.050304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.051703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.052102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.055692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.057465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.059687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.296 [2024-07-25 13:40:06.061098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.061493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.061632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.063536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.063587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.065001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.065490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.072014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.074047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.075842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.076302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.076778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.296 [2024-07-25 13:40:06.076916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.296 [2024-07-25 13:40:06.079062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.081061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.082388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.082792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.088524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.090354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.091939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.094094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.094439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.096477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.096945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.098070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.100014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.558 [2024-07-25 13:40:06.100402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.105835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.106511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.108427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.110498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.110825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.112748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.114764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.116949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.117409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.117918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.122110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.123528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.558 [2024-07-25 13:40:06.125326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.126884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.127354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.129166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.131126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.132584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.134608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.134969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.140317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.142293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.142767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.143227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.558 [2024-07-25 13:40:06.143585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.558 [2024-07-25 13:40:06.145858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.824 [... previous *ERROR* message from accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources repeated continuously for subsequent allocation attempts, from 2024-07-25 13:40:06.145858 through 2024-07-25 13:40:06.513076 ...] 
00:31:25.824 [2024-07-25 13:40:06.513135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.513494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.516423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.516478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.516522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.516571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.516933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.518606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.518655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.518700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.518745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.519066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.521907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.824 [2024-07-25 13:40:06.521962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.522007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.522051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.824 [2024-07-25 13:40:06.522455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.522574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.522620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.522665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.522713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.523192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.526353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.526409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.526453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.526498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.825 [2024-07-25 13:40:06.526887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.526996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.527042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.527087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.527131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.527526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.528855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.528910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.528956] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.529002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.529398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.529543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.529596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.825 [2024-07-25 13:40:06.529641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.529695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.530032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.531238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.531291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.531350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.531395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.531890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.532072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.532119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.532166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.532217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.532544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.825 [2024-07-25 13:40:06.533614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.533673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.533718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.533765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.534087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.534198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.534252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.534296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.534341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.534814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.535918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.535971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.536015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.825 [2024-07-25 13:40:06.536060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.536412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.536522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.536573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.536619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.536663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.537047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.538221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.538274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.538319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.538369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.538697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.538807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.825 [2024-07-25 13:40:06.538852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.538897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.538941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.539327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.540431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.540483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.540527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.540576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.541046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.541226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.541274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.541320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.541371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.825 [2024-07-25 13:40:06.541734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.542869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.542930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.542975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.543020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.543406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.543513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.543564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.543609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.543653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.544050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.545630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.545683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.825 [2024-07-25 13:40:06.545728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.545772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.546092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.825 [2024-07-25 13:40:06.546200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.546246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.546291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.546335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.546793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.547989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.548051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.548096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.548140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.548519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.826 [2024-07-25 13:40:06.548632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.548679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.548724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.548778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.549214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.550590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.550643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.550689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.550736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.551158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.551309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.551356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.551401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.826 [2024-07-25 13:40:06.551449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.551944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.553272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.553338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.553382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.553426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.553893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.554073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.554121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.554172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.554218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.554700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.556293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.826 [2024-07-25 13:40:06.556347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.556391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.556435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.556965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.557147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.557194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.557241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.557286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.557788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.559245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.559298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.559343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.559390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.826 [2024-07-25 13:40:06.559872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.560025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.560072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.560116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.560166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.560711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.562276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.562329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.562375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.562420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.562940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.563120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:25.826 [2024-07-25 13:40:06.563167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:25.826 [2024-07-25 13:40:06.563213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:26.660
00:31:26.660 Latency(us)
00:31:26.660 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:26.660 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:26.660 Verification LBA range: start 0x0 length 0x100
00:31:26.660 crypto_ram : 5.75 46.23 2.89 0.00 0.00 2676672.46 11494.01 2271376.94
00:31:26.660 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:26.660 Verification LBA range: start 0x100 length 0x100
00:31:26.660 crypto_ram : 6.07 36.57 2.29 0.00 0.00 3267949.23 97598.23 3290915.45
00:31:26.660 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:26.660 Verification LBA range: start 0x0 length 0x100
00:31:26.660 crypto_ram2 : 5.76 47.07 2.94 0.00 0.00 2542117.91 10284.11 2168132.53
00:31:26.660 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:26.660 Verification LBA range: start 0x100 length 0x100
00:31:26.660 crypto_ram2 : 6.10 40.02 2.50 0.00 0.00 2916552.00 89128.96 3290915.45
00:31:26.661 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:26.661 Verification LBA range: start 0x0 length 0x100
00:31:26.661 crypto_ram3 : 5.59 324.64 20.29 0.00 0.00 350317.74 3302.01 477505.38
00:31:26.661 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:26.661 Verification LBA range: start 0x100 length 0x100
00:31:26.661 crypto_ram3 : 5.78 213.30 13.33 0.00 0.00 514877.62 42346.34 613013.66
00:31:26.661 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:26.661 Verification LBA range: start 0x0 length 0x100
00:31:26.661 crypto_ram4 : 5.71 342.79 21.42 0.00 0.00 321000.07 5368.91 454920.66
00:31:26.661 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:26.661 Verification LBA range: start 0x100 length 0x100
00:31:26.661 crypto_ram4 : 5.93 237.59 14.85 0.00 0.00 452275.52 28634.19 471052.60
00:31:26.661 ===================================================================================================================
00:31:26.661 Total : 1288.21 80.51 0.00 0.00 724714.43 3302.01 3290915.45
00:31:26.921
00:31:26.921 real 0m8.983s
00:31:26.921 user 0m17.306s
00:31:26.921 sys 0m0.299s
00:31:26.921 13:40:07 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable
00:31:26.921 13:40:07 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:31:26.921 ************************************
00:31:26.921 END TEST bdev_verify_big_io
00:31:26.921 ************************************
00:31:26.921 13:40:07 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:26.921 13:40:07 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:31:26.921 13:40:07 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable
00:31:26.921 13:40:07 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:26.921 ************************************
00:31:26.921 START TEST bdev_write_zeroes
00:31:26.921 ************************************
00:31:26.921 13:40:07 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:27.181 [2024-07-25 13:40:07.739102] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:31:27.181 [2024-07-25 13:40:07.739148] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1095560 ]
00:31:27.181 [2024-07-25 13:40:07.828478] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:27.181 [2024-07-25 13:40:07.906078] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:27.181 [2024-07-25 13:40:07.927143] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:31:27.181 [2024-07-25 13:40:07.935167] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:31:27.181 [2024-07-25 13:40:07.943185] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:31:27.444 [2024-07-25 13:40:08.026049] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:31:29.987 [2024-07-25 13:40:10.188911] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:31:29.987 [2024-07-25 13:40:10.188957] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:31:29.987 [2024-07-25 13:40:10.188965] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:29.987 [2024-07-25 13:40:10.196931] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:31:29.987 [2024-07-25 13:40:10.196941] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:31:29.987 [2024-07-25 13:40:10.196952] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:29.987 [2024-07-25 13:40:10.204951] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:31:29.987 [2024-07-25 13:40:10.204960] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:31:29.987 [2024-07-25 13:40:10.204966] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:29.987 [2024-07-25 13:40:10.212971] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:31:29.987 [2024-07-25 13:40:10.212981] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:31:29.987 [2024-07-25 13:40:10.212986] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:29.987 Running I/O for 1 seconds...
00:31:30.557
00:31:30.557 Latency(us)
00:31:30.557 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:30.557 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:30.557 crypto_ram : 1.02 2331.43 9.11 0.00 0.00 54518.12 4688.34 64931.05
00:31:30.557 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:30.557 crypto_ram2 : 1.02 2337.16 9.13 0.00 0.00 54132.25 4637.93 60494.77
00:31:30.557 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:30.557 crypto_ram3 : 1.02 18041.05 70.47 0.00 0.00 7006.23 2142.52 8922.98
00:31:30.557 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:30.557 crypto_ram4 : 1.02 18078.14 70.62 0.00 0.00 6973.08 2155.13 7309.78
00:31:30.557 ===================================================================================================================
00:31:30.557 Total : 40787.79 159.33 0.00 0.00 12427.11 2142.52 64931.05
00:31:30.846
00:31:30.846 real 0m3.876s
00:31:30.846 user 0m3.610s
00:31:30.846 sys 0m0.231s
00:31:30.846 13:40:11 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable
00:31:30.846 13:40:11 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:31:30.846 ************************************
00:31:30.846 END TEST bdev_write_zeroes
00:31:30.846 ************************************
00:31:30.847 13:40:11 blockdev_crypto_aesni -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:30.847 13:40:11 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:31:30.847 13:40:11 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable
00:31:30.847 13:40:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:30.847 ************************************
00:31:30.847 START TEST bdev_json_nonenclosed
00:31:30.847 ************************************
00:31:30.847 13:40:11 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:31.106 [2024-07-25 13:40:11.694208] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:31:31.106 [2024-07-25 13:40:11.694261] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1096190 ]
00:31:31.106 [2024-07-25 13:40:11.784035] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:31.106 [2024-07-25 13:40:11.860834] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:31.106 [2024-07-25 13:40:11.860890] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:31:31.106 [2024-07-25 13:40:11.860902] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:31:31.106 [2024-07-25 13:40:11.860909] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:31:31.367
00:31:31.367 real 0m0.291s
00:31:31.367 user 0m0.175s
00:31:31.367 sys 0m0.114s
00:31:31.367 13:40:11 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable
00:31:31.367 13:40:11 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:31:31.367 ************************************
00:31:31.367 END TEST bdev_json_nonenclosed
00:31:31.367 ************************************
00:31:31.367 13:40:11 blockdev_crypto_aesni -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:31.367 13:40:11 blockdev_crypto_aesni -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']'
00:31:31.367 13:40:11 blockdev_crypto_aesni -- common/autotest_common.sh@1107 -- # xtrace_disable
00:31:31.367 13:40:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:31.367 ************************************
00:31:31.367 START TEST bdev_json_nonarray
00:31:31.367 ************************************
00:31:31.367 13:40:11 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:31.367 [2024-07-25 13:40:12.061638] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:31:31.367 [2024-07-25 13:40:12.061682] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1096212 ]
00:31:31.367 [2024-07-25 13:40:12.149313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:31.626 [2024-07-25 13:40:12.220784] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:31.626 [2024-07-25 13:40:12.220845] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:31:31.626 [2024-07-25 13:40:12.220855] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:31:31.626 [2024-07-25 13:40:12.220861] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:31:31.626
00:31:31.626 real 0m0.280s
00:31:31.626 user 0m0.168s
00:31:31.626 sys 0m0.110s
00:31:31.626 13:40:12 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable
00:31:31.626 13:40:12 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:31:31.626 ************************************
00:31:31.626 END TEST bdev_json_nonarray
00:31:31.626 ************************************
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@786 -- # [[ crypto_aesni == bdev ]]
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@793 -- # [[ crypto_aesni == gpt ]]
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@797 -- # [[ crypto_aesni == crypto_sw ]]
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # cleanup
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]]
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]]
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]]
00:31:31.626 13:40:12 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]]
00:31:31.626
00:31:31.626 real 1m9.244s
00:31:31.626 user 2m46.156s
00:31:31.626 sys 0m6.549s
13:40:12 
blockdev_crypto_aesni -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:31.626 13:40:12 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:31:31.626 ************************************ 00:31:31.626 END TEST blockdev_crypto_aesni 00:31:31.626 ************************************ 00:31:31.626 13:40:12 -- spdk/autotest.sh@362 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:31:31.626 13:40:12 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:31:31.626 13:40:12 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:31.626 13:40:12 -- common/autotest_common.sh@10 -- # set +x 00:31:31.626 ************************************ 00:31:31.626 START TEST blockdev_crypto_sw 00:31:31.626 ************************************ 00:31:31.626 13:40:12 blockdev_crypto_sw -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw 00:31:31.886 * Looking for test storage... 
00:31:31.886 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # uname -s 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@681 -- # test_type=crypto_sw 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # crypto_device= 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # dek= 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # env_ctx= 00:31:31.886 
13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == bdev ]] 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@689 -- # [[ crypto_sw == crypto_* ]] 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1096294 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 1096294 00:31:31.886 13:40:12 blockdev_crypto_sw -- common/autotest_common.sh@831 -- # '[' -z 1096294 ']' 00:31:31.886 13:40:12 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:31.886 13:40:12 blockdev_crypto_sw -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:31.886 13:40:12 blockdev_crypto_sw -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:31.886 13:40:12 blockdev_crypto_sw -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:31.886 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:31.886 13:40:12 blockdev_crypto_sw -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:31.886 13:40:12 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:31.886 [2024-07-25 13:40:12.587086] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:31:31.886 [2024-07-25 13:40:12.587151] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1096294 ] 00:31:31.886 [2024-07-25 13:40:12.677099] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:32.147 [2024-07-25 13:40:12.747192] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:32.718 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:32.718 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@864 -- # return 0 00:31:32.718 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:31:32.718 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@710 -- # setup_crypto_sw_conf 00:31:32.718 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@192 -- # rpc_cmd 00:31:32.718 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:32.718 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:32.977 Malloc0 00:31:32.977 Malloc1 00:31:32.977 true 00:31:32.977 true 00:31:32.977 true 00:31:32.977 [2024-07-25 13:40:13.635714] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:32.977 crypto_ram 00:31:32.977 [2024-07-25 13:40:13.643740] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:32.977 crypto_ram2 00:31:32.978 [2024-07-25 13:40:13.651760] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:32.978 crypto_ram3 00:31:32.978 [ 00:31:32.978 { 00:31:32.978 "name": "Malloc1", 00:31:32.978 "aliases": [ 00:31:32.978 "96888f2d-e261-43b7-9041-1ba452b3949c" 00:31:32.978 ], 00:31:32.978 "product_name": "Malloc disk", 00:31:32.978 "block_size": 4096, 00:31:32.978 "num_blocks": 4096, 00:31:32.978 "uuid": "96888f2d-e261-43b7-9041-1ba452b3949c", 
00:31:32.978 "assigned_rate_limits": { 00:31:32.978 "rw_ios_per_sec": 0, 00:31:32.978 "rw_mbytes_per_sec": 0, 00:31:32.978 "r_mbytes_per_sec": 0, 00:31:32.978 "w_mbytes_per_sec": 0 00:31:32.978 }, 00:31:32.978 "claimed": true, 00:31:32.978 "claim_type": "exclusive_write", 00:31:32.978 "zoned": false, 00:31:32.978 "supported_io_types": { 00:31:32.978 "read": true, 00:31:32.978 "write": true, 00:31:32.978 "unmap": true, 00:31:32.978 "flush": true, 00:31:32.978 "reset": true, 00:31:32.978 "nvme_admin": false, 00:31:32.978 "nvme_io": false, 00:31:32.978 "nvme_io_md": false, 00:31:32.978 "write_zeroes": true, 00:31:32.978 "zcopy": true, 00:31:32.978 "get_zone_info": false, 00:31:32.978 "zone_management": false, 00:31:32.978 "zone_append": false, 00:31:32.978 "compare": false, 00:31:32.978 "compare_and_write": false, 00:31:32.978 "abort": true, 00:31:32.978 "seek_hole": false, 00:31:32.978 "seek_data": false, 00:31:32.978 "copy": true, 00:31:32.978 "nvme_iov_md": false 00:31:32.978 }, 00:31:32.978 "memory_domains": [ 00:31:32.978 { 00:31:32.978 "dma_device_id": "system", 00:31:32.978 "dma_device_type": 1 00:31:32.978 }, 00:31:32.978 { 00:31:32.978 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:32.978 "dma_device_type": 2 00:31:32.978 } 00:31:32.978 ], 00:31:32.978 "driver_specific": {} 00:31:32.978 } 00:31:32.978 ] 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:32.978 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:32.978 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # cat 00:31:32.978 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:31:32.978 13:40:13 
blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:32.978 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:32.978 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:32.978 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:31:32.978 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:31:32.978 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@561 -- # xtrace_disable 00:31:32.978 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:31:33.238 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:31:33.238 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r .name 00:31:33.238 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "744a6677-20ea-58fb-a9d4-e3830c764240"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' 
"uuid": "744a6677-20ea-58fb-a9d4-e3830c764240",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1cf7f30c-a07b-553f-8915-e7c6f8cfbd61"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "1cf7f30c-a07b-553f-8915-e7c6f8cfbd61",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": 
"crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:31:33.238 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:31:33.238 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:31:33.238 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:31:33.238 13:40:13 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # killprocess 1096294 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@950 -- # '[' -z 1096294 ']' 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # kill -0 1096294 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # uname 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1096294 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1096294' 00:31:33.238 killing process with pid 1096294 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@969 -- # kill 1096294 00:31:33.238 13:40:13 blockdev_crypto_sw -- common/autotest_common.sh@974 -- # wait 1096294 00:31:33.499 13:40:14 blockdev_crypto_sw -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:33.499 13:40:14 blockdev_crypto_sw -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:33.499 13:40:14 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:31:33.499 
13:40:14 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:33.499 13:40:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:33.499 ************************************ 00:31:33.499 START TEST bdev_hello_world 00:31:33.499 ************************************ 00:31:33.499 13:40:14 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:33.499 [2024-07-25 13:40:14.201114] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:31:33.499 [2024-07-25 13:40:14.201161] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1096605 ] 00:31:33.499 [2024-07-25 13:40:14.290120] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:33.760 [2024-07-25 13:40:14.365196] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:33.760 [2024-07-25 13:40:14.504778] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:33.760 [2024-07-25 13:40:14.504827] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:33.760 [2024-07-25 13:40:14.504835] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:33.760 [2024-07-25 13:40:14.512796] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:33.760 [2024-07-25 13:40:14.512806] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:33.760 [2024-07-25 13:40:14.512812] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:31:33.760 [2024-07-25 13:40:14.520816] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:33.760 [2024-07-25 13:40:14.520826] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:33.760 [2024-07-25 13:40:14.520831] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:34.021 [2024-07-25 13:40:14.557570] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:31:34.021 [2024-07-25 13:40:14.557593] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:31:34.021 [2024-07-25 13:40:14.557603] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:31:34.021 [2024-07-25 13:40:14.558609] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:31:34.021 [2024-07-25 13:40:14.558660] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:31:34.021 [2024-07-25 13:40:14.558668] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:31:34.021 [2024-07-25 13:40:14.558692] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:31:34.021 00:31:34.021 [2024-07-25 13:40:14.558700] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:31:34.021 00:31:34.021 real 0m0.540s 00:31:34.021 user 0m0.374s 00:31:34.021 sys 0m0.153s 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:31:34.021 ************************************ 00:31:34.021 END TEST bdev_hello_world 00:31:34.021 ************************************ 00:31:34.021 13:40:14 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:31:34.021 13:40:14 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:31:34.021 13:40:14 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:34.021 13:40:14 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:34.021 ************************************ 00:31:34.021 START TEST bdev_bounds 00:31:34.021 ************************************ 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1096703 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1096703' 00:31:34.021 Process bdevio pid: 1096703 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1096703 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@831 -- # '[' -z 1096703 ']' 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:34.021 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:31:34.021 13:40:14 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:34.281 [2024-07-25 13:40:14.812760] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:31:34.281 [2024-07-25 13:40:14.812814] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1096703 ] 00:31:34.281 [2024-07-25 13:40:14.903608] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:34.281 [2024-07-25 13:40:14.973595] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:31:34.281 [2024-07-25 13:40:14.973801] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:31:34.281 [2024-07-25 13:40:14.973884] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:31:34.542 [2024-07-25 13:40:15.119815] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:34.542 [2024-07-25 13:40:15.119862] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:34.542 [2024-07-25 13:40:15.119870] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending 
base bdev arrival 00:31:34.542 [2024-07-25 13:40:15.127833] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:34.542 [2024-07-25 13:40:15.127844] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:34.542 [2024-07-25 13:40:15.127850] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:34.542 [2024-07-25 13:40:15.135856] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:34.542 [2024-07-25 13:40:15.135866] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:34.542 [2024-07-25 13:40:15.135872] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:35.112 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:31:35.112 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@864 -- # return 0 00:31:35.112 13:40:15 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:35.112 I/O targets: 00:31:35.112 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:31:35.112 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:31:35.112 00:31:35.112 00:31:35.112 CUnit - A unit testing framework for C - Version 2.1-3 00:31:35.112 http://cunit.sourceforge.net/ 00:31:35.112 00:31:35.112 00:31:35.112 Suite: bdevio tests on: crypto_ram3 00:31:35.112 Test: blockdev write read block ...passed 00:31:35.112 Test: blockdev write zeroes read block ...passed 00:31:35.112 Test: blockdev write zeroes read no split ...passed 00:31:35.112 Test: blockdev write zeroes read split ...passed 00:31:35.112 Test: blockdev write zeroes read split partial ...passed 00:31:35.112 Test: blockdev reset ...passed 00:31:35.112 Test: blockdev write read 8 blocks ...passed 00:31:35.112 Test: blockdev write read 
size > 128k ...passed 00:31:35.112 Test: blockdev write read invalid size ...passed 00:31:35.112 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:35.112 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:35.113 Test: blockdev write read max offset ...passed 00:31:35.113 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:35.113 Test: blockdev writev readv 8 blocks ...passed 00:31:35.113 Test: blockdev writev readv 30 x 1block ...passed 00:31:35.113 Test: blockdev writev readv block ...passed 00:31:35.113 Test: blockdev writev readv size > 128k ...passed 00:31:35.113 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:35.113 Test: blockdev comparev and writev ...passed 00:31:35.113 Test: blockdev nvme passthru rw ...passed 00:31:35.113 Test: blockdev nvme passthru vendor specific ...passed 00:31:35.113 Test: blockdev nvme admin passthru ...passed 00:31:35.113 Test: blockdev copy ...passed 00:31:35.113 Suite: bdevio tests on: crypto_ram 00:31:35.113 Test: blockdev write read block ...passed 00:31:35.113 Test: blockdev write zeroes read block ...passed 00:31:35.113 Test: blockdev write zeroes read no split ...passed 00:31:35.113 Test: blockdev write zeroes read split ...passed 00:31:35.113 Test: blockdev write zeroes read split partial ...passed 00:31:35.113 Test: blockdev reset ...passed 00:31:35.113 Test: blockdev write read 8 blocks ...passed 00:31:35.113 Test: blockdev write read size > 128k ...passed 00:31:35.113 Test: blockdev write read invalid size ...passed 00:31:35.113 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:35.113 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:35.113 Test: blockdev write read max offset ...passed 00:31:35.113 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:35.113 Test: blockdev writev readv 8 blocks ...passed 00:31:35.113 Test: blockdev writev 
readv 30 x 1block ...passed
00:31:35.113 Test: blockdev writev readv block ...passed
00:31:35.113 Test: blockdev writev readv size > 128k ...passed
00:31:35.113 Test: blockdev writev readv size > 128k in two iovs ...passed
00:31:35.113 Test: blockdev comparev and writev ...passed
00:31:35.113 Test: blockdev nvme passthru rw ...passed
00:31:35.113 Test: blockdev nvme passthru vendor specific ...passed
00:31:35.113 Test: blockdev nvme admin passthru ...passed
00:31:35.113 Test: blockdev copy ...passed
00:31:35.113
00:31:35.113 Run Summary: Type Total Ran Passed Failed Inactive
00:31:35.113 suites 2 2 n/a 0 0
00:31:35.113 tests 46 46 46 0 0
00:31:35.113 asserts 260 260 260 0 n/a
00:31:35.113
00:31:35.113 Elapsed time = 0.152 seconds
00:31:35.113 0
00:31:35.113 13:40:15 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1096703
00:31:35.113 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1096703 ']'
00:31:35.113 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1096703
00:31:35.113 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # uname
00:31:35.113 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:31:35.113 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1096703
00:31:35.374 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:31:35.374 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:31:35.374 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1096703'
00:31:35.374 killing process with pid 1096703
00:31:35.374 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1096703
00:31:35.374 13:40:15 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1096703
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT
00:31:35.374
00:31:35.374 real 0m1.271s
00:31:35.374 user 0m3.451s
00:31:35.374 sys 0m0.271s
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:31:35.374 ************************************
00:31:35.374 END TEST bdev_bounds
00:31:35.374 ************************************
00:31:35.374 13:40:16 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' ''
00:31:35.374 13:40:16 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']'
00:31:35.374 13:40:16 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable
00:31:35.374 13:40:16 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x
00:31:35.374 ************************************
00:31:35.374 START TEST bdev_nbd
00:31:35.374 ************************************
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' ''
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]]
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram3')
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=2
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]]
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=2
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1096966
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1096966 /var/tmp/spdk-nbd.sock
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1096966 ']'
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:31:35.374 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable
00:31:35.374 13:40:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:31:35.374 [2024-07-25 13:40:16.163346] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:31:35.374 [2024-07-25 13:40:16.163395] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:31:35.634 [2024-07-25 13:40:16.252799] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:35.634 [2024-07-25 13:40:16.319585] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:31:35.895 [2024-07-25 13:40:16.463601] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw"
00:31:35.895 [2024-07-25 13:40:16.463644] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:31:35.895 [2024-07-25 13:40:16.463652] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:35.895 [2024-07-25 13:40:16.471619] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2"
00:31:35.895 [2024-07-25 13:40:16.471630] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:31:35.895 [2024-07-25 13:40:16.471636] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:35.895 [2024-07-25 13:40:16.479640] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3"
00:31:35.895 [2024-07-25 13:40:16.479649] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2
00:31:35.895 [2024-07-25 13:40:16.479655] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:36.465 13:40:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:31:36.465 13:40:16 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@864 -- # return 0
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3'
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3'
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device
00:31:36.465 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 ))
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 ))
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:31:36.466 1+0 records in
00:31:36.466 1+0 records out
00:31:36.466 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000205435 s, 19.9 MB/s
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 ))
00:31:36.466 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:31:36.726 1+0 records in
00:31:36.726 1+0 records out
00:31:36.726 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000252425 s, 16.2 MB/s
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 ))
00:31:36.726 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[
00:31:36.986 {
00:31:36.986 "nbd_device": "/dev/nbd0",
00:31:36.986 "bdev_name": "crypto_ram"
00:31:36.986 },
00:31:36.986 {
00:31:36.986 "nbd_device": "/dev/nbd1",
00:31:36.986 "bdev_name": "crypto_ram3"
00:31:36.986 }
00:31:36.986 ]'
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device'))
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[
00:31:36.986 {
00:31:36.986 "nbd_device": "/dev/nbd0",
00:31:36.986 "bdev_name": "crypto_ram"
00:31:36.986 },
00:31:36.986 {
00:31:36.986 "nbd_device": "/dev/nbd1",
00:31:36.986 "bdev_name": "crypto_ram3"
00:31:36.986 }
00:31:36.986 ]'
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device'
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:31:36.986 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:31:37.246 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:31:37.246 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:31:37.246 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:31:37.246 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:31:37.246 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:31:37.246 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:31:37.246 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:31:37.246 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:31:37.246 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:31:37.246 13:40:17 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:31:37.506 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']'
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1'
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1'
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3')
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
00:31:37.767 /dev/nbd0
00:31:37.767 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:31:38.028 1+0 records in
00:31:38.028 1+0 records out
00:31:38.028 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00021092 s, 19.4 MB/s
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1
00:31:38.028 /dev/nbd1
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:31:38.028 1+0 records in
00:31:38.028 1+0 records out
00:31:38.028 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243236 s, 16.8 MB/s
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 ))
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:38.028 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:31:38.289 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[
00:31:38.289 {
00:31:38.289 "nbd_device": "/dev/nbd0",
00:31:38.289 "bdev_name": "crypto_ram"
00:31:38.289 },
00:31:38.289 {
00:31:38.289 "nbd_device": "/dev/nbd1",
00:31:38.289 "bdev_name": "crypto_ram3"
00:31:38.289 }
00:31:38.289 ]'
00:31:38.289 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[
00:31:38.289 {
00:31:38.289 "nbd_device": "/dev/nbd0",
00:31:38.289 "bdev_name": "crypto_ram"
00:31:38.289 },
00:31:38.289 {
00:31:38.289 "nbd_device": "/dev/nbd1",
00:31:38.289 "bdev_name": "crypto_ram3"
00:31:38.289 }
00:31:38.289 ]'
00:31:38.289 13:40:18 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0
00:31:38.289 /dev/nbd1'
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0
00:31:38.289 /dev/nbd1'
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']'
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']'
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256
00:31:38.289 256+0 records in
00:31:38.289 256+0 records out
00:31:38.289 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0125447 s, 83.6 MB/s
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:31:38.289 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct
00:31:38.550 256+0 records in
00:31:38.550 256+0 records out
00:31:38.550 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.017407 s, 60.2 MB/s
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}"
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct
00:31:38.550 256+0 records in
00:31:38.550 256+0 records out
00:31:38.550 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0354319 s, 29.6 MB/s
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']'
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']'
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}"
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:31:38.550 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:38.810 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']'
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1'
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1')
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret
00:31:39.070 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512
00:31:39.330 malloc_lvol_verify
00:31:39.330 13:40:19 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs
00:31:39.330 d8af7079-cbcb-4455-9f4c-ead037e2b592
00:31:39.330 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs
00:31:39.590 d0410276-2c8b-4a11-baeb-5e79ebb44490
00:31:39.590 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0
00:31:39.850 /dev/nbd0
00:31:39.850 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0
00:31:39.850 mke2fs 1.46.5 (30-Dec-2021)
00:31:39.850 Discarding device blocks: 0/4096 done
00:31:39.850 Creating filesystem with 4096 1k blocks and 1024 inodes
00:31:39.850
00:31:39.850 Allocating group tables: 0/1 done
00:31:39.850 Writing inode tables: 0/1 done
00:31:39.850 Creating journal (1024 blocks): done
00:31:39.850 Writing superblocks and filesystem accounting information: 0/1 done
00:31:39.850
00:31:39.850 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0
00:31:39.851 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0
00:31:39.851 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:31:39.851 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0')
00:31:39.851 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:31:39.851 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:31:39.851 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:31:39.851 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 --
# grep -q -w nbd0 /proc/partitions 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1096966 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1096966 ']' 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1096966 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1096966 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1096966' 00:31:40.111 killing process with pid 1096966 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1096966 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1096966 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:31:40.111 00:31:40.111 real 0m4.773s 00:31:40.111 user 0m7.173s 00:31:40.111 sys 0m1.412s 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:40.111 13:40:20 blockdev_crypto_sw.bdev_nbd -- 
common/autotest_common.sh@10 -- # set +x 00:31:40.111 ************************************ 00:31:40.111 END TEST bdev_nbd 00:31:40.111 ************************************ 00:31:40.373 13:40:20 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # [[ y == y ]] 00:31:40.373 13:40:20 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = nvme ']' 00:31:40.373 13:40:20 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # '[' crypto_sw = gpt ']' 00:31:40.373 13:40:20 blockdev_crypto_sw -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:31:40.373 13:40:20 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:31:40.373 13:40:20 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:40.373 13:40:20 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:40.373 ************************************ 00:31:40.373 START TEST bdev_fio 00:31:40.373 ************************************ 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:40.373 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:31:40.373 13:40:20 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 
00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:40.373 ************************************ 00:31:40.373 START TEST bdev_fio_rw_verify 00:31:40.373 ************************************ 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1345 -- # grep libasan 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:40.373 13:40:21 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:40.943 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:31:40.943 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:40.943 fio-3.35 00:31:40.943 Starting 2 threads 00:31:53.165 00:31:53.165 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1098172: Thu Jul 25 13:40:31 2024 00:31:53.165 read: IOPS=30.0k, BW=117MiB/s (123MB/s)(1174MiB/10001msec) 00:31:53.165 slat (usec): min=11, max=432, avg=13.84, stdev= 3.10 00:31:53.165 clat (usec): min=5, max=539, avg=104.80, stdev=41.00 00:31:53.165 lat (usec): min=17, max=551, avg=118.64, stdev=41.91 00:31:53.165 clat percentiles (usec): 00:31:53.165 | 50.000th=[ 103], 99.000th=[ 198], 99.900th=[ 225], 99.990th=[ 251], 00:31:53.165 | 99.999th=[ 297] 00:31:53.165 write: IOPS=36.1k, BW=141MiB/s (148MB/s)(1337MiB/9479msec); 0 zone resets 00:31:53.165 slat (usec): min=11, max=1374, avg=24.53, stdev= 4.62 00:31:53.165 clat (usec): min=20, max=1662, avg=143.19, stdev=65.53 00:31:53.165 lat (usec): min=41, max=1687, avg=167.72, stdev=66.88 00:31:53.165 clat percentiles (usec): 00:31:53.165 | 50.000th=[ 141], 99.000th=[ 285], 99.900th=[ 326], 99.990th=[ 619], 00:31:53.165 | 99.999th=[ 930] 00:31:53.165 bw ( KiB/s): min=127576, max=142952, per=94.73%, avg=136790.32, stdev=2239.14, samples=38 00:31:53.165 iops : min=31894, max=35738, avg=34197.58, stdev=559.79, samples=38 00:31:53.165 lat (usec) : 10=0.01%, 20=0.01%, 50=8.33%, 100=29.32%, 250=58.63% 00:31:53.165 lat (usec) : 500=3.71%, 750=0.01%, 1000=0.01% 00:31:53.165 lat (msec) : 2=0.01% 00:31:53.165 cpu : usr=99.69%, sys=0.00%, ctx=28, majf=0, minf=508 00:31:53.165 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:53.165 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:53.165 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:53.165 issued rwts: total=300476,342209,0,0 short=0,0,0,0 dropped=0,0,0,0 00:31:53.165 latency : target=0, window=0, percentile=100.00%, depth=8 
00:31:53.165 00:31:53.165 Run status group 0 (all jobs): 00:31:53.165 READ: bw=117MiB/s (123MB/s), 117MiB/s-117MiB/s (123MB/s-123MB/s), io=1174MiB (1231MB), run=10001-10001msec 00:31:53.165 WRITE: bw=141MiB/s (148MB/s), 141MiB/s-141MiB/s (148MB/s-148MB/s), io=1337MiB (1402MB), run=9479-9479msec 00:31:53.165 00:31:53.165 real 0m11.028s 00:31:53.165 user 0m26.693s 00:31:53.165 sys 0m0.321s 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:31:53.165 ************************************ 00:31:53.165 END TEST bdev_fio_rw_verify 00:31:53.165 ************************************ 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:53.165 13:40:32 
blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:31:53.165 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "744a6677-20ea-58fb-a9d4-e3830c764240"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "744a6677-20ea-58fb-a9d4-e3830c764240",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": 
{' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1cf7f30c-a07b-553f-8915-e7c6f8cfbd61"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "1cf7f30c-a07b-553f-8915-e7c6f8cfbd61",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:31:53.166 crypto_ram3 ]] 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "744a6677-20ea-58fb-a9d4-e3830c764240"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "744a6677-20ea-58fb-a9d4-e3830c764240",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' 
"supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "1cf7f30c-a07b-553f-8915-e7c6f8cfbd61"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "1cf7f30c-a07b-553f-8915-e7c6f8cfbd61",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 
'select(.supported_io_types.unmap == true) | .name') 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:53.166 ************************************ 00:31:53.166 START TEST bdev_fio_trim 00:31:53.166 ************************************ 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:53.166 13:40:32 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:53.166 13:40:32 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:53.166 13:40:32 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:53.166 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:53.166 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:53.166 fio-3.35 00:31:53.166 Starting 2 threads 00:32:03.180 00:32:03.180 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=1100163: Thu Jul 25 13:40:43 2024 00:32:03.180 write: IOPS=56.3k, BW=220MiB/s 
(231MB/s)(2201MiB/10001msec); 0 zone resets 00:32:03.180 slat (usec): min=10, max=1789, avg=14.93, stdev= 4.99 00:32:03.180 clat (usec): min=24, max=2043, avg=118.30, stdev=65.46 00:32:03.180 lat (usec): min=40, max=2078, avg=133.24, stdev=67.94 00:32:03.180 clat percentiles (usec): 00:32:03.180 | 50.000th=[ 94], 99.000th=[ 251], 99.900th=[ 289], 99.990th=[ 490], 00:32:03.180 | 99.999th=[ 660] 00:32:03.180 bw ( KiB/s): min=203912, max=230344, per=99.96%, avg=225282.95, stdev=3726.94, samples=38 00:32:03.180 iops : min=50978, max=57586, avg=56320.74, stdev=931.73, samples=38 00:32:03.180 trim: IOPS=56.3k, BW=220MiB/s (231MB/s)(2201MiB/10001msec); 0 zone resets 00:32:03.180 slat (nsec): min=4733, max=85539, avg=6988.98, stdev=2383.95 00:32:03.180 clat (usec): min=34, max=1898, avg=79.04, stdev=23.97 00:32:03.180 lat (usec): min=39, max=1906, avg=86.03, stdev=24.07 00:32:03.180 clat percentiles (usec): 00:32:03.180 | 50.000th=[ 80], 99.000th=[ 135], 99.900th=[ 157], 99.990th=[ 277], 00:32:03.180 | 99.999th=[ 494] 00:32:03.180 bw ( KiB/s): min=203936, max=230344, per=99.96%, avg=225284.63, stdev=3725.14, samples=38 00:32:03.180 iops : min=50984, max=57586, avg=56321.16, stdev=931.28, samples=38 00:32:03.180 lat (usec) : 50=13.90%, 100=52.67%, 250=32.92%, 500=0.51%, 750=0.01% 00:32:03.180 lat (msec) : 2=0.01%, 4=0.01% 00:32:03.180 cpu : usr=99.70%, sys=0.00%, ctx=21, majf=0, minf=283 00:32:03.180 IO depths : 1=7.5%, 2=17.5%, 4=60.0%, 8=15.0%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:03.180 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:03.181 complete : 0=0.0%, 4=87.0%, 8=13.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:03.181 issued rwts: total=0,563499,563500,0 short=0,0,0,0 dropped=0,0,0,0 00:32:03.181 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:03.181 00:32:03.181 Run status group 0 (all jobs): 00:32:03.181 WRITE: bw=220MiB/s (231MB/s), 220MiB/s-220MiB/s (231MB/s-231MB/s), io=2201MiB (2308MB), run=10001-10001msec 
00:32:03.181 TRIM: bw=220MiB/s (231MB/s), 220MiB/s-220MiB/s (231MB/s-231MB/s), io=2201MiB (2308MB), run=10001-10001msec 00:32:03.181 00:32:03.181 real 0m11.036s 00:32:03.181 user 0m27.297s 00:32:03.181 sys 0m0.351s 00:32:03.181 13:40:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:03.181 13:40:43 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:32:03.181 ************************************ 00:32:03.181 END TEST bdev_fio_trim 00:32:03.181 ************************************ 00:32:03.181 13:40:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:32:03.181 13:40:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:03.181 13:40:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:32:03.181 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:03.181 13:40:43 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:32:03.181 00:32:03.181 real 0m22.416s 00:32:03.181 user 0m54.179s 00:32:03.181 sys 0m0.850s 00:32:03.181 13:40:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:03.181 13:40:43 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:03.181 ************************************ 00:32:03.181 END TEST bdev_fio 00:32:03.181 ************************************ 00:32:03.181 13:40:43 blockdev_crypto_sw -- bdev/blockdev.sh@774 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:03.181 13:40:43 blockdev_crypto_sw -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:03.181 13:40:43 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:32:03.181 13:40:43 
blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:03.181 13:40:43 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:03.181 ************************************ 00:32:03.181 START TEST bdev_verify 00:32:03.181 ************************************ 00:32:03.181 13:40:43 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:03.181 [2024-07-25 13:40:43.503521] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:32:03.181 [2024-07-25 13:40:43.503589] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1101954 ] 00:32:03.181 [2024-07-25 13:40:43.595864] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:03.181 [2024-07-25 13:40:43.673538] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:03.181 [2024-07-25 13:40:43.673542] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:03.181 [2024-07-25 13:40:43.815214] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:03.181 [2024-07-25 13:40:43.815272] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:03.181 [2024-07-25 13:40:43.815280] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:03.181 [2024-07-25 13:40:43.823233] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:03.181 [2024-07-25 13:40:43.823243] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:03.181 [2024-07-25 13:40:43.823249] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:03.181 [2024-07-25 13:40:43.831256] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:03.181 [2024-07-25 13:40:43.831265] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:03.181 [2024-07-25 13:40:43.831271] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:03.181 Running I/O for 5 seconds... 00:32:08.465 00:32:08.465 Latency(us) 00:32:08.465 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:08.465 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:08.465 Verification LBA range: start 0x0 length 0x800 00:32:08.465 crypto_ram : 5.01 7463.49 29.15 0.00 0.00 17080.27 1165.78 22786.36 00:32:08.465 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:08.465 Verification LBA range: start 0x800 length 0x800 00:32:08.465 crypto_ram : 5.01 5080.49 19.85 0.00 0.00 25080.28 1814.84 29844.09 00:32:08.465 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:32:08.465 Verification LBA range: start 0x0 length 0x800 00:32:08.465 crypto_ram3 : 5.02 3748.87 14.64 0.00 0.00 33985.36 1569.08 25811.10 00:32:08.465 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:32:08.465 Verification LBA range: start 0x800 length 0x800 00:32:08.465 crypto_ram3 : 5.02 2548.99 9.96 0.00 0.00 49908.00 2003.89 34482.02 00:32:08.465 =================================================================================================================== 00:32:08.465 Total : 18841.83 73.60 0.00 0.00 27052.93 1165.78 34482.02 00:32:08.465 00:32:08.465 real 0m5.593s 00:32:08.465 user 0m10.684s 00:32:08.465 sys 0m0.160s 00:32:08.465 13:40:49 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:32:08.465 13:40:49 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:32:08.465 ************************************ 00:32:08.465 END TEST bdev_verify 00:32:08.465 ************************************ 00:32:08.465 13:40:49 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:08.465 13:40:49 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:32:08.465 13:40:49 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:08.465 13:40:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:08.465 ************************************ 00:32:08.465 START TEST bdev_verify_big_io 00:32:08.465 ************************************ 00:32:08.465 13:40:49 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:32:08.465 [2024-07-25 13:40:49.173021] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:32:08.465 [2024-07-25 13:40:49.173074] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1102869 ] 00:32:08.726 [2024-07-25 13:40:49.264969] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:08.726 [2024-07-25 13:40:49.328164] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:08.726 [2024-07-25 13:40:49.328168] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:08.726 [2024-07-25 13:40:49.472236] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:08.726 [2024-07-25 13:40:49.472286] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:08.726 [2024-07-25 13:40:49.472294] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:08.726 [2024-07-25 13:40:49.480254] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:08.726 [2024-07-25 13:40:49.480265] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:08.726 [2024-07-25 13:40:49.480271] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:08.726 [2024-07-25 13:40:49.488273] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:08.726 [2024-07-25 13:40:49.488283] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:08.726 [2024-07-25 13:40:49.488288] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:08.985 Running I/O for 5 seconds... 
00:32:14.264 00:32:14.264 Latency(us) 00:32:14.264 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:14.264 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:32:14.264 Verification LBA range: start 0x0 length 0x80 00:32:14.264 crypto_ram : 5.03 483.64 30.23 0.00 0.00 257952.19 3831.34 383940.14 00:32:14.264 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:32:14.264 Verification LBA range: start 0x80 length 0x80 00:32:14.264 crypto_ram : 5.12 349.84 21.87 0.00 0.00 354664.04 4637.93 490410.93 00:32:14.264 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:32:14.264 Verification LBA range: start 0x0 length 0x80 00:32:14.264 crypto_ram3 : 5.37 286.04 17.88 0.00 0.00 422021.47 3453.24 392006.10 00:32:14.264 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:32:14.264 Verification LBA range: start 0x80 length 0x80 00:32:14.264 crypto_ram3 : 5.37 190.52 11.91 0.00 0.00 621175.46 4663.14 500090.09 00:32:14.264 =================================================================================================================== 00:32:14.264 Total : 1310.05 81.88 0.00 0.00 375472.63 3453.24 500090.09 00:32:14.524 00:32:14.524 real 0m5.938s 00:32:14.524 user 0m11.389s 00:32:14.524 sys 0m0.161s 00:32:14.524 13:40:55 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:14.524 13:40:55 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:32:14.524 ************************************ 00:32:14.524 END TEST bdev_verify_big_io 00:32:14.524 ************************************ 00:32:14.524 13:40:55 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:32:14.524 13:40:55 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:32:14.524 13:40:55 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:14.524 13:40:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:14.524 ************************************ 00:32:14.524 START TEST bdev_write_zeroes 00:32:14.524 ************************************ 00:32:14.524 13:40:55 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:14.524 [2024-07-25 13:40:55.190152] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:32:14.524 [2024-07-25 13:40:55.190201] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1103865 ] 00:32:14.524 [2024-07-25 13:40:55.279680] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:14.783 [2024-07-25 13:40:55.356545] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:14.783 [2024-07-25 13:40:55.497356] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:14.783 [2024-07-25 13:40:55.497405] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:14.783 [2024-07-25 13:40:55.497413] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:14.783 [2024-07-25 13:40:55.505375] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:32:14.783 [2024-07-25 13:40:55.505386] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:14.783 
[2024-07-25 13:40:55.505392] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:14.783 [2024-07-25 13:40:55.513395] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:32:14.783 [2024-07-25 13:40:55.513404] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:32:14.783 [2024-07-25 13:40:55.513410] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:14.783 Running I/O for 1 seconds... 00:32:16.163 00:32:16.163 Latency(us) 00:32:16.163 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:16.163 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:16.163 crypto_ram : 1.01 32775.89 128.03 0.00 0.00 3896.33 1008.25 5444.53 00:32:16.163 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:32:16.163 crypto_ram3 : 1.01 16359.66 63.90 0.00 0.00 7775.86 4864.79 8015.56 00:32:16.163 =================================================================================================================== 00:32:16.163 Total : 49135.55 191.94 0.00 0.00 5189.51 1008.25 8015.56 00:32:16.163 00:32:16.163 real 0m1.555s 00:32:16.163 user 0m1.393s 00:32:16.163 sys 0m0.145s 00:32:16.163 13:40:56 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:16.163 13:40:56 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:32:16.163 ************************************ 00:32:16.163 END TEST bdev_write_zeroes 00:32:16.163 ************************************ 00:32:16.163 13:40:56 blockdev_crypto_sw -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:32:16.163 13:40:56 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:32:16.163 13:40:56 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:16.163 13:40:56 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:16.163 ************************************ 00:32:16.163 START TEST bdev_json_nonenclosed 00:32:16.163 ************************************ 00:32:16.163 13:40:56 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:16.163 [2024-07-25 13:40:56.826188] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:32:16.163 [2024-07-25 13:40:56.826233] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1104186 ] 00:32:16.163 [2024-07-25 13:40:56.914257] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:16.423 [2024-07-25 13:40:56.987487] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:16.423 [2024-07-25 13:40:56.987541] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:32:16.423 [2024-07-25 13:40:56.987557] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:16.423 [2024-07-25 13:40:56.987564] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:16.423 00:32:16.423 real 0m0.274s 00:32:16.423 user 0m0.166s 00:32:16.423 sys 0m0.106s 00:32:16.423 13:40:57 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:16.423 13:40:57 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:32:16.423 ************************************ 00:32:16.423 END TEST bdev_json_nonenclosed 00:32:16.423 ************************************ 00:32:16.423 13:40:57 blockdev_crypto_sw -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:16.423 13:40:57 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:32:16.423 13:40:57 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:16.423 13:40:57 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:16.423 ************************************ 00:32:16.423 START TEST bdev_json_nonarray 00:32:16.423 ************************************ 00:32:16.423 13:40:57 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:32:16.423 [2024-07-25 13:40:57.175786] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:32:16.423 [2024-07-25 13:40:57.175832] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1104208 ] 00:32:16.683 [2024-07-25 13:40:57.261404] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:16.683 [2024-07-25 13:40:57.326363] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:16.683 [2024-07-25 13:40:57.326418] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:32:16.683 [2024-07-25 13:40:57.326428] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:32:16.683 [2024-07-25 13:40:57.326435] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:32:16.683 00:32:16.683 real 0m0.259s 00:32:16.683 user 0m0.161s 00:32:16.683 sys 0m0.096s 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:32:16.683 ************************************ 00:32:16.683 END TEST bdev_json_nonarray 00:32:16.683 ************************************ 00:32:16.683 13:40:57 blockdev_crypto_sw -- bdev/blockdev.sh@786 -- # [[ crypto_sw == bdev ]] 00:32:16.683 13:40:57 blockdev_crypto_sw -- bdev/blockdev.sh@793 -- # [[ crypto_sw == gpt ]] 00:32:16.683 13:40:57 blockdev_crypto_sw -- bdev/blockdev.sh@797 -- # [[ crypto_sw == crypto_sw ]] 00:32:16.683 13:40:57 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:32:16.683 13:40:57 blockdev_crypto_sw -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:32:16.683 13:40:57 blockdev_crypto_sw -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:16.683 13:40:57 blockdev_crypto_sw -- 
common/autotest_common.sh@10 -- # set +x 00:32:16.683 ************************************ 00:32:16.683 START TEST bdev_crypto_enomem 00:32:16.683 ************************************ 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1125 -- # bdev_crypto_enomem 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@634 -- # local base_dev=base0 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local test_dev=crypt0 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local err_dev=EE_base0 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local qd=32 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # ERR_PID=1104239 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # waitforlisten 1104239 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@831 -- # '[' -z 1104239 ']' 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:16.683 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:16.683 13:40:57 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:16.943 [2024-07-25 13:40:57.519739] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:32:16.943 [2024-07-25 13:40:57.519798] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1104239 ] 00:32:16.943 [2024-07-25 13:40:57.610264] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:16.943 [2024-07-25 13:40:57.719196] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@864 -- # return 0 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@644 -- # rpc_cmd 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:17.885 true 00:32:17.885 base0 00:32:17.885 true 00:32:17.885 [2024-07-25 13:40:58.405851] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:32:17.885 crypt0 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@651 -- # waitforbdev crypt0 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local bdev_name=crypt0 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- 
common/autotest_common.sh@900 -- # local bdev_timeout= 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # local i 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # [[ -z '' ]] 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@902 -- # bdev_timeout=2000 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@904 -- # rpc_cmd bdev_wait_for_examine 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@906 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:17.885 [ 00:32:17.885 { 00:32:17.885 "name": "crypt0", 00:32:17.885 "aliases": [ 00:32:17.885 "fb2f0eac-28ef-597c-a7bc-4026f448cf57" 00:32:17.885 ], 00:32:17.885 "product_name": "crypto", 00:32:17.885 "block_size": 512, 00:32:17.885 "num_blocks": 2097152, 00:32:17.885 "uuid": "fb2f0eac-28ef-597c-a7bc-4026f448cf57", 00:32:17.885 "assigned_rate_limits": { 00:32:17.885 "rw_ios_per_sec": 0, 00:32:17.885 "rw_mbytes_per_sec": 0, 00:32:17.885 "r_mbytes_per_sec": 0, 00:32:17.885 "w_mbytes_per_sec": 0 00:32:17.885 }, 00:32:17.885 "claimed": false, 00:32:17.885 "zoned": false, 00:32:17.885 "supported_io_types": { 00:32:17.885 "read": true, 00:32:17.885 "write": true, 00:32:17.885 "unmap": false, 00:32:17.885 "flush": false, 00:32:17.885 "reset": true, 00:32:17.885 "nvme_admin": false, 
00:32:17.885 "nvme_io": false, 00:32:17.885 "nvme_io_md": false, 00:32:17.885 "write_zeroes": true, 00:32:17.885 "zcopy": false, 00:32:17.885 "get_zone_info": false, 00:32:17.885 "zone_management": false, 00:32:17.885 "zone_append": false, 00:32:17.885 "compare": false, 00:32:17.885 "compare_and_write": false, 00:32:17.885 "abort": false, 00:32:17.885 "seek_hole": false, 00:32:17.885 "seek_data": false, 00:32:17.885 "copy": false, 00:32:17.885 "nvme_iov_md": false 00:32:17.885 }, 00:32:17.885 "memory_domains": [ 00:32:17.885 { 00:32:17.885 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:17.885 "dma_device_type": 2 00:32:17.885 } 00:32:17.885 ], 00:32:17.885 "driver_specific": { 00:32:17.885 "crypto": { 00:32:17.885 "base_bdev_name": "EE_base0", 00:32:17.885 "name": "crypt0", 00:32:17.885 "key_name": "test_dek_sw" 00:32:17.885 } 00:32:17.885 } 00:32:17.885 } 00:32:17.885 ] 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@907 -- # return 0 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # rpcpid=1104431 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@656 -- # sleep 1 00:32:17.885 13:40:58 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@653 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:32:17.885 Running I/O for 5 seconds... 
00:32:18.827 13:40:59 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:32:18.827 13:40:59 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:18.827 13:40:59 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:18.827 13:40:59 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:18.827 13:40:59 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@659 -- # wait 1104431 00:32:23.031 00:32:23.031 Latency(us) 00:32:23.031 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:23.031 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:32:23.031 crypt0 : 5.00 28893.88 112.87 0.00 0.00 1102.40 538.78 1953.48 00:32:23.031 =================================================================================================================== 00:32:23.031 Total : 28893.88 112.87 0.00 0.00 1102.40 538.78 1953.48 00:32:23.031 0 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@661 -- # rpc_cmd bdev_crypto_delete crypt0 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@663 -- # killprocess 1104239 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@950 -- # '[' -z 1104239 ']' 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # kill -0 1104239 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # uname 00:32:23.031 13:41:03 
blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1104239 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1104239' 00:32:23.031 killing process with pid 1104239 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@969 -- # kill 1104239 00:32:23.031 Received shutdown signal, test time was about 5.000000 seconds 00:32:23.031 00:32:23.031 Latency(us) 00:32:23.031 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:23.031 =================================================================================================================== 00:32:23.031 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@974 -- # wait 1104239 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # trap - SIGINT SIGTERM EXIT 00:32:23.031 00:32:23.031 real 0m6.348s 00:32:23.031 user 0m6.570s 00:32:23.031 sys 0m0.309s 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:23.031 13:41:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:23.031 ************************************ 00:32:23.031 END TEST bdev_crypto_enomem 00:32:23.031 ************************************ 00:32:23.293 13:41:03 blockdev_crypto_sw -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:32:23.293 13:41:03 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # 
cleanup 00:32:23.293 13:41:03 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:32:23.293 13:41:03 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:23.293 13:41:03 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:32:23.293 13:41:03 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:32:23.293 13:41:03 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:32:23.293 13:41:03 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:32:23.293 00:32:23.293 real 0m51.456s 00:32:23.293 user 1m37.709s 00:32:23.293 sys 0m4.639s 00:32:23.293 13:41:03 blockdev_crypto_sw -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:23.293 13:41:03 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:23.293 ************************************ 00:32:23.293 END TEST blockdev_crypto_sw 00:32:23.293 ************************************ 00:32:23.293 13:41:03 -- spdk/autotest.sh@363 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:32:23.293 13:41:03 -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:32:23.293 13:41:03 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:23.293 13:41:03 -- common/autotest_common.sh@10 -- # set +x 00:32:23.293 ************************************ 00:32:23.293 START TEST blockdev_crypto_qat 00:32:23.293 ************************************ 00:32:23.293 13:41:03 blockdev_crypto_qat -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:32:23.293 * Looking for test storage... 
00:32:23.293 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@669 -- # QOS_DEV_1=Malloc_0 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_2=Null_1 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@671 -- # QOS_RUN_TIME=5 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # uname -s 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@673 -- # '[' Linux = Linux ']' 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@675 -- # PRE_RESERVED_MEM=0 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@681 -- # test_type=crypto_qat 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # crypto_device= 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # dek= 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # 
env_ctx= 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # wait_for_rpc= 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # '[' -n '' ']' 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == bdev ]] 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@689 -- # [[ crypto_qat == crypto_* ]] 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # wait_for_rpc=--wait-for-rpc 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@692 -- # start_spdk_tgt 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=1105342 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 1105342 00:32:23.293 13:41:04 blockdev_crypto_qat -- common/autotest_common.sh@831 -- # '[' -z 1105342 ']' 00:32:23.293 13:41:04 blockdev_crypto_qat -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:23.293 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:23.293 13:41:04 blockdev_crypto_qat -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:23.293 13:41:04 blockdev_crypto_qat -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:23.293 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:23.293 13:41:04 blockdev_crypto_qat -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:23.293 13:41:04 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:23.555 [2024-07-25 13:41:04.133674] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:32:23.555 [2024-07-25 13:41:04.133744] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1105342 ] 00:32:23.555 [2024-07-25 13:41:04.225052] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:23.555 [2024-07-25 13:41:04.320348] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:24.498 13:41:04 blockdev_crypto_qat -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:24.498 13:41:04 blockdev_crypto_qat -- common/autotest_common.sh@864 -- # return 0 00:32:24.498 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # case "$test_type" in 00:32:24.498 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@707 -- # setup_crypto_qat_conf 00:32:24.498 13:41:04 blockdev_crypto_qat -- bdev/blockdev.sh@169 -- # rpc_cmd 00:32:24.498 13:41:04 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:24.498 13:41:04 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:24.498 [2024-07-25 13:41:04.990367] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:24.498 [2024-07-25 13:41:04.998398] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:24.498 [2024-07-25 13:41:05.006414] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:24.498 [2024-07-25 13:41:05.070034] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:27.043 true 00:32:27.043 true 00:32:27.043 true 00:32:27.043 true 00:32:27.043 Malloc0 00:32:27.043 Malloc1 00:32:27.043 Malloc2 00:32:27.043 Malloc3 00:32:27.043 [2024-07-25 13:41:07.483115] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 
00:32:27.043 crypto_ram 00:32:27.043 [2024-07-25 13:41:07.491135] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:27.043 crypto_ram1 00:32:27.043 [2024-07-25 13:41:07.499159] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:27.043 crypto_ram2 00:32:27.043 [2024-07-25 13:41:07.507178] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:27.043 crypto_ram3 00:32:27.043 [ 00:32:27.043 { 00:32:27.043 "name": "Malloc1", 00:32:27.043 "aliases": [ 00:32:27.043 "fdcf40bc-ecf6-4b23-90ec-5b022c8cda68" 00:32:27.043 ], 00:32:27.043 "product_name": "Malloc disk", 00:32:27.043 "block_size": 512, 00:32:27.043 "num_blocks": 65536, 00:32:27.043 "uuid": "fdcf40bc-ecf6-4b23-90ec-5b022c8cda68", 00:32:27.043 "assigned_rate_limits": { 00:32:27.043 "rw_ios_per_sec": 0, 00:32:27.043 "rw_mbytes_per_sec": 0, 00:32:27.043 "r_mbytes_per_sec": 0, 00:32:27.043 "w_mbytes_per_sec": 0 00:32:27.043 }, 00:32:27.044 "claimed": true, 00:32:27.044 "claim_type": "exclusive_write", 00:32:27.044 "zoned": false, 00:32:27.044 "supported_io_types": { 00:32:27.044 "read": true, 00:32:27.044 "write": true, 00:32:27.044 "unmap": true, 00:32:27.044 "flush": true, 00:32:27.044 "reset": true, 00:32:27.044 "nvme_admin": false, 00:32:27.044 "nvme_io": false, 00:32:27.044 "nvme_io_md": false, 00:32:27.044 "write_zeroes": true, 00:32:27.044 "zcopy": true, 00:32:27.044 "get_zone_info": false, 00:32:27.044 "zone_management": false, 00:32:27.044 "zone_append": false, 00:32:27.044 "compare": false, 00:32:27.044 "compare_and_write": false, 00:32:27.044 "abort": true, 00:32:27.044 "seek_hole": false, 00:32:27.044 "seek_data": false, 00:32:27.044 "copy": true, 00:32:27.044 "nvme_iov_md": false 00:32:27.044 }, 00:32:27.044 "memory_domains": [ 00:32:27.044 { 00:32:27.044 "dma_device_id": "system", 00:32:27.044 "dma_device_type": 1 00:32:27.044 }, 00:32:27.044 { 00:32:27.044 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:32:27.044 "dma_device_type": 2 00:32:27.044 } 00:32:27.044 ], 00:32:27.044 "driver_specific": {} 00:32:27.044 } 00:32:27.044 ] 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@736 -- # rpc_cmd bdev_wait_for_examine 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # cat 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n accel 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n bdev 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@739 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # mapfile -t bdevs 00:32:27.044 13:41:07 blockdev_crypto_qat -- 
bdev/blockdev.sh@747 -- # rpc_cmd bdev_get_bdevs 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@747 -- # jq -r '.[] | select(.claimed == false)' 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@561 -- # xtrace_disable 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs_name 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r .name 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "bcefee04-e466-5382-9c50-75b4e9f3b9eb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bcefee04-e466-5382-9c50-75b4e9f3b9eb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "9f1c373a-0c00-55e4-993c-dbdc2c95d2fb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": 
"9f1c373a-0c00-55e4-993c-dbdc2c95d2fb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "a7f7f865-fe35-5ded-830d-1ed3ff594dc0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a7f7f865-fe35-5ded-830d-1ed3ff594dc0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' 
"name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "0c5f90a4-e0b0-5eb3-ba90-d92209d876cf"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "0c5f90a4-e0b0-5eb3-ba90-d92209d876cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # bdev_list=("${bdevs_name[@]}") 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@751 -- # hello_world_bdev=crypto_ram 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # trap - SIGINT SIGTERM EXIT 00:32:27.044 13:41:07 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # killprocess 1105342 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@950 -- # '[' -z 1105342 ']' 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # kill -0 1105342 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # uname 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:27.044 13:41:07 
blockdev_crypto_qat -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1105342 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1105342' 00:32:27.044 killing process with pid 1105342 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@969 -- # kill 1105342 00:32:27.044 13:41:07 blockdev_crypto_qat -- common/autotest_common.sh@974 -- # wait 1105342 00:32:27.615 13:41:08 blockdev_crypto_qat -- bdev/blockdev.sh@757 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:27.615 13:41:08 blockdev_crypto_qat -- bdev/blockdev.sh@759 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:27.615 13:41:08 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 7 -le 1 ']' 00:32:27.615 13:41:08 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:27.615 13:41:08 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:27.616 ************************************ 00:32:27.616 START TEST bdev_hello_world 00:32:27.616 ************************************ 00:32:27.616 13:41:08 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:27.616 [2024-07-25 13:41:08.238381] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:32:27.616 [2024-07-25 13:41:08.238436] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1106089 ] 00:32:27.616 [2024-07-25 13:41:08.328592] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:27.876 [2024-07-25 13:41:08.423173] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:27.876 [2024-07-25 13:41:08.444332] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:27.876 [2024-07-25 13:41:08.452354] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:27.876 [2024-07-25 13:41:08.460370] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:27.876 [2024-07-25 13:41:08.558564] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:30.417 [2024-07-25 13:41:10.790072] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:30.417 [2024-07-25 13:41:10.790116] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:30.417 [2024-07-25 13:41:10.790124] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:30.417 [2024-07-25 13:41:10.798090] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:30.417 [2024-07-25 13:41:10.798101] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:30.417 [2024-07-25 13:41:10.798106] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:30.417 [2024-07-25 13:41:10.806110] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc2" 00:32:30.417 [2024-07-25 13:41:10.806120] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:30.417 [2024-07-25 13:41:10.806125] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:30.417 [2024-07-25 13:41:10.814130] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:30.417 [2024-07-25 13:41:10.814139] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:30.417 [2024-07-25 13:41:10.814144] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:30.417 [2024-07-25 13:41:10.875618] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:30.417 [2024-07-25 13:41:10.875648] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:30.417 [2024-07-25 13:41:10.875659] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:30.417 [2024-07-25 13:41:10.876683] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:30.417 [2024-07-25 13:41:10.876734] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:30.417 [2024-07-25 13:41:10.876743] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:30.417 [2024-07-25 13:41:10.876775] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:32:30.417 00:32:30.417 [2024-07-25 13:41:10.876785] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:30.417 00:32:30.417 real 0m2.922s 00:32:30.417 user 0m2.564s 00:32:30.417 sys 0m0.321s 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:30.417 ************************************ 00:32:30.417 END TEST bdev_hello_world 00:32:30.417 ************************************ 00:32:30.417 13:41:11 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_bounds bdev_bounds '' 00:32:30.417 13:41:11 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:32:30.417 13:41:11 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:30.417 13:41:11 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:30.417 ************************************ 00:32:30.417 START TEST bdev_bounds 00:32:30.417 ************************************ 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1125 -- # bdev_bounds '' 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # bdevio_pid=1106478 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # echo 'Process bdevio pid: 1106478' 00:32:30.417 Process bdevio pid: 1106478 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@288 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # waitforlisten 1106478 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds 
-- common/autotest_common.sh@831 -- # '[' -z 1106478 ']' 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@836 -- # local max_retries=100 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:30.417 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@840 -- # xtrace_disable 00:32:30.417 13:41:11 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:30.678 [2024-07-25 13:41:11.237211] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:32:30.678 [2024-07-25 13:41:11.237256] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1106478 ] 00:32:30.678 [2024-07-25 13:41:11.325720] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:30.678 [2024-07-25 13:41:11.390415] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:32:30.678 [2024-07-25 13:41:11.390576] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 2 00:32:30.678 [2024-07-25 13:41:11.390595] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:32:30.678 [2024-07-25 13:41:11.411753] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:30.678 [2024-07-25 13:41:11.419779] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:30.678 [2024-07-25 13:41:11.427799] accel_rpc.c: 167:rpc_accel_assign_opc: 
*NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:30.938 [2024-07-25 13:41:11.512905] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:33.477 [2024-07-25 13:41:13.665732] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:33.477 [2024-07-25 13:41:13.665786] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:33.477 [2024-07-25 13:41:13.665795] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:33.477 [2024-07-25 13:41:13.673749] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:33.477 [2024-07-25 13:41:13.673761] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:33.477 [2024-07-25 13:41:13.673767] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:33.477 [2024-07-25 13:41:13.681770] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:33.477 [2024-07-25 13:41:13.681784] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:33.477 [2024-07-25 13:41:13.681790] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:33.477 [2024-07-25 13:41:13.689789] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:33.477 [2024-07-25 13:41:13.689800] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:33.477 [2024-07-25 13:41:13.689805] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:33.477 13:41:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:32:33.477 13:41:13 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@864 -- # 
return 0 00:32:33.477 13:41:13 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:33.477 I/O targets: 00:32:33.477 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:32:33.477 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:32:33.477 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:32:33.477 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:32:33.477 00:32:33.477 00:32:33.477 CUnit - A unit testing framework for C - Version 2.1-3 00:32:33.477 http://cunit.sourceforge.net/ 00:32:33.477 00:32:33.477 00:32:33.477 Suite: bdevio tests on: crypto_ram3 00:32:33.477 Test: blockdev write read block ...passed 00:32:33.477 Test: blockdev write zeroes read block ...passed 00:32:33.477 Test: blockdev write zeroes read no split ...passed 00:32:33.477 Test: blockdev write zeroes read split ...passed 00:32:33.477 Test: blockdev write zeroes read split partial ...passed 00:32:33.477 Test: blockdev reset ...passed 00:32:33.477 Test: blockdev write read 8 blocks ...passed 00:32:33.477 Test: blockdev write read size > 128k ...passed 00:32:33.477 Test: blockdev write read invalid size ...passed 00:32:33.477 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:33.477 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:33.477 Test: blockdev write read max offset ...passed 00:32:33.477 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:33.477 Test: blockdev writev readv 8 blocks ...passed 00:32:33.477 Test: blockdev writev readv 30 x 1block ...passed 00:32:33.477 Test: blockdev writev readv block ...passed 00:32:33.477 Test: blockdev writev readv size > 128k ...passed 00:32:33.477 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:33.477 Test: blockdev comparev and writev ...passed 00:32:33.477 Test: blockdev nvme passthru rw ...passed 00:32:33.477 Test: blockdev nvme passthru vendor 
specific ...passed 00:32:33.477 Test: blockdev nvme admin passthru ...passed 00:32:33.477 Test: blockdev copy ...passed 00:32:33.477 Suite: bdevio tests on: crypto_ram2 00:32:33.477 Test: blockdev write read block ...passed 00:32:33.477 Test: blockdev write zeroes read block ...passed 00:32:33.477 Test: blockdev write zeroes read no split ...passed 00:32:33.477 Test: blockdev write zeroes read split ...passed 00:32:33.477 Test: blockdev write zeroes read split partial ...passed 00:32:33.477 Test: blockdev reset ...passed 00:32:33.477 Test: blockdev write read 8 blocks ...passed 00:32:33.477 Test: blockdev write read size > 128k ...passed 00:32:33.477 Test: blockdev write read invalid size ...passed 00:32:33.477 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:33.477 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:33.477 Test: blockdev write read max offset ...passed 00:32:33.477 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:33.477 Test: blockdev writev readv 8 blocks ...passed 00:32:33.477 Test: blockdev writev readv 30 x 1block ...passed 00:32:33.477 Test: blockdev writev readv block ...passed 00:32:33.477 Test: blockdev writev readv size > 128k ...passed 00:32:33.477 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:33.477 Test: blockdev comparev and writev ...passed 00:32:33.477 Test: blockdev nvme passthru rw ...passed 00:32:33.477 Test: blockdev nvme passthru vendor specific ...passed 00:32:33.477 Test: blockdev nvme admin passthru ...passed 00:32:33.477 Test: blockdev copy ...passed 00:32:33.477 Suite: bdevio tests on: crypto_ram1 00:32:33.477 Test: blockdev write read block ...passed 00:32:33.477 Test: blockdev write zeroes read block ...passed 00:32:33.477 Test: blockdev write zeroes read no split ...passed 00:32:33.477 Test: blockdev write zeroes read split ...passed 00:32:33.737 Test: blockdev write zeroes read split partial ...passed 00:32:33.737 
Test: blockdev reset ...passed
00:32:33.737 Test: blockdev write read 8 blocks ...passed
00:32:33.737 Test: blockdev write read size > 128k ...passed
00:32:33.737 Test: blockdev write read invalid size ...passed
00:32:33.737 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:32:33.737 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:32:33.737 Test: blockdev write read max offset ...passed
00:32:33.737 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:32:33.737 Test: blockdev writev readv 8 blocks ...passed
00:32:33.737 Test: blockdev writev readv 30 x 1block ...passed
00:32:33.737 Test: blockdev writev readv block ...passed
00:32:33.737 Test: blockdev writev readv size > 128k ...passed
00:32:33.737 Test: blockdev writev readv size > 128k in two iovs ...passed
00:32:33.737 Test: blockdev comparev and writev ...passed
00:32:33.737 Test: blockdev nvme passthru rw ...passed
00:32:33.737 Test: blockdev nvme passthru vendor specific ...passed
00:32:33.737 Test: blockdev nvme admin passthru ...passed
00:32:33.737 Test: blockdev copy ...passed
00:32:33.737 Suite: bdevio tests on: crypto_ram
00:32:33.737 Test: blockdev write read block ...passed
00:32:33.737 Test: blockdev write zeroes read block ...passed
00:32:33.737 Test: blockdev write zeroes read no split ...passed
00:32:33.997 Test: blockdev write zeroes read split ...passed
00:32:33.997 Test: blockdev write zeroes read split partial ...passed
00:32:33.997 Test: blockdev reset ...passed
00:32:33.997 Test: blockdev write read 8 blocks ...passed
00:32:33.997 Test: blockdev write read size > 128k ...passed
00:32:33.997 Test: blockdev write read invalid size ...passed
00:32:33.997 Test: blockdev write read offset + nbytes == size of blockdev ...passed
00:32:33.997 Test: blockdev write read offset + nbytes > size of blockdev ...passed
00:32:33.997 Test: blockdev write read max offset ...passed
00:32:33.997 Test: blockdev write read 2 blocks on overlapped address offset ...passed
00:32:33.997 Test: blockdev writev readv 8 blocks ...passed
00:32:33.997 Test: blockdev writev readv 30 x 1block ...passed
00:32:33.997 Test: blockdev writev readv block ...passed
00:32:33.997 Test: blockdev writev readv size > 128k ...passed
00:32:34.260 Test: blockdev writev readv size > 128k in two iovs ...passed
00:32:34.260 Test: blockdev comparev and writev ...passed
00:32:34.260 Test: blockdev nvme passthru rw ...passed
00:32:34.260 Test: blockdev nvme passthru vendor specific ...passed
00:32:34.260 Test: blockdev nvme admin passthru ...passed
00:32:34.260 Test: blockdev copy ...passed
00:32:34.260
00:32:34.260 Run Summary: Type Total Ran Passed Failed Inactive
00:32:34.260 suites 4 4 n/a 0 0
00:32:34.260 tests 92 92 92 0 0
00:32:34.260 asserts 520 520 520 0 n/a
00:32:34.260
00:32:34.260 Elapsed time = 1.852 seconds
00:32:34.260 0
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # killprocess 1106478
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@950 -- # '[' -z 1106478 ']'
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # kill -0 1106478
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # uname
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']'
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1106478
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@956 -- # process_name=reactor_0
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']'
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1106478'
00:32:34.260 killing process with pid 1106478
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@969 -- # kill 1106478
00:32:34.260 13:41:14 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@974 -- # wait 1106478
00:32:34.586 13:41:15 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # trap - SIGINT SIGTERM EXIT
00:32:34.586
00:32:34.587 real 0m3.918s
00:32:34.587 user 0m10.578s
00:32:34.587 sys 0m0.414s
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1126 -- # xtrace_disable
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x
00:32:34.587 ************************************
00:32:34.587 END TEST bdev_bounds
00:32:34.587 ************************************
00:32:34.587 13:41:15 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' ''
00:32:34.587 13:41:15 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 5 -le 1 ']'
00:32:34.587 13:41:15 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable
00:32:34.587 13:41:15 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:32:34.587 ************************************
00:32:34.587 START TEST bdev_nbd
00:32:34.587 ************************************
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1125 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' ''
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # uname -s
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@299 -- # [[ Linux == Linux ]]
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@301 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3')
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local bdev_all
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_num=4
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@308 -- # [[ -e /sys/module/nbd ]]
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9')
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@310 -- # local nbd_all
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # bdev_num=4
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@313 -- # local nbd_list
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3')
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local bdev_list
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # nbd_pid=1107131
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # waitforlisten 1107131 /var/tmp/spdk-nbd.sock
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@316 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json ''
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@831 -- # '[' -z 1107131 ']'
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk-nbd.sock
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@836 -- # local max_retries=100
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...'
00:32:34.587 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@840 -- # xtrace_disable
00:32:34.587 13:41:15 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x
00:32:34.587 [2024-07-25 13:41:15.237219] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:32:34.587 [2024-07-25 13:41:15.237272] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:32:34.587 [2024-07-25 13:41:15.331733] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:32:34.869 [2024-07-25 13:41:15.409651] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:32:34.869 [2024-07-25 13:41:15.430730] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat
00:32:34.869 [2024-07-25 13:41:15.438755] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:32:34.869 [2024-07-25 13:41:15.446772] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:32:34.869 [2024-07-25 13:41:15.532333] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96
00:32:37.409 [2024-07-25 13:41:17.687785] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc"
00:32:37.409 [2024-07-25 13:41:17.687835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:32:37.409 [2024-07-25 13:41:17.687844] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:37.409 [2024-07-25 13:41:17.695802] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts"
00:32:37.409 [2024-07-25 13:41:17.695814] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:32:37.409 [2024-07-25 13:41:17.695820] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:37.409 [2024-07-25 13:41:17.703822] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2"
00:32:37.409 [2024-07-25 13:41:17.703835] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:32:37.409 [2024-07-25 13:41:17.703840] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:37.409 [2024-07-25 13:41:17.711842] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:32:37.409 [2024-07-25 13:41:17.711854] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:32:37.409 [2024-07-25 13:41:17.711860] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@864 -- # return 0
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@321 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3'
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3')
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3'
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3')
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 ))
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:32:37.409 13:41:17 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:37.409 1+0 records in
00:32:37.409 1+0 records out
00:32:37.409 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000257044 s, 15.9 MB/s
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:32:37.409 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:37.669 1+0 records in
00:32:37.669 1+0 records out
00:32:37.669 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256228 s, 16.0 MB/s
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:32:37.669 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd2
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd2 /proc/partitions
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:37.928 1+0 records in
00:32:37.928 1+0 records out
00:32:37.928 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000301123 s, 13.6 MB/s
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:32:37.928 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3
00:32:38.188 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3
00:32:38.188 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd3
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd3 /proc/partitions
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:38.189 1+0 records in
00:32:38.189 1+0 records out
00:32:38.189 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000279336 s, 14.7 MB/s
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ ))
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 ))
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[
00:32:38.189 {
00:32:38.189 "nbd_device": "/dev/nbd0",
00:32:38.189 "bdev_name": "crypto_ram"
00:32:38.189 },
00:32:38.189 {
00:32:38.189 "nbd_device": "/dev/nbd1",
00:32:38.189 "bdev_name": "crypto_ram1"
00:32:38.189 },
00:32:38.189 {
00:32:38.189 "nbd_device": "/dev/nbd2",
00:32:38.189 "bdev_name": "crypto_ram2"
00:32:38.189 },
00:32:38.189 {
00:32:38.189 "nbd_device": "/dev/nbd3",
00:32:38.189 "bdev_name": "crypto_ram3"
00:32:38.189 }
00:32:38.189 ]'
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device'))
00:32:38.189 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[
00:32:38.189 {
00:32:38.189 "nbd_device": "/dev/nbd0",
00:32:38.189 "bdev_name": "crypto_ram"
00:32:38.189 },
00:32:38.189 {
00:32:38.189 "nbd_device": "/dev/nbd1",
00:32:38.189 "bdev_name": "crypto_ram1"
00:32:38.189 },
00:32:38.189 {
00:32:38.189 "nbd_device": "/dev/nbd2",
00:32:38.189 "bdev_name": "crypto_ram2"
00:32:38.189 },
00:32:38.189 {
00:32:38.189 "nbd_device": "/dev/nbd3",
00:32:38.189 "bdev_name": "crypto_ram3"
00:32:38.189 }
00:32:38.189 ]'
00:32:38.449 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device'
00:32:38.449 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3'
00:32:38.449 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:38.449 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3')
00:32:38.449 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list
00:32:38.449 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i
00:32:38.449 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:32:38.449 13:41:18 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0
00:32:38.449 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0
00:32:38.449 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0
00:32:38.449 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0
00:32:38.449 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:32:38.449 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:32:38.449 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions
00:32:38.449 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:32:38.449 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:32:38.449 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:32:38.449 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1
00:32:38.709 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1
00:32:38.709 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1
00:32:38.709 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1
00:32:38.709 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:32:38.709 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:32:38.709 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions
00:32:38.709 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:32:38.709 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:32:38.709 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:32:38.709 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2
00:32:38.969 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2
00:32:38.969 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2
00:32:38.969 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2
00:32:38.969 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:32:38.969 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:32:38.969 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions
00:32:38.969 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:32:38.969 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:32:38.969 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}"
00:32:38.969 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 ))
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 ))
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]'
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]'
00:32:39.229 13:41:19 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device'
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name=
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo ''
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']'
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11'
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3')
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11'
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3')
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11')
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 ))
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0
00:32:39.490 /dev/nbd0
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0
00:32:39.490 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd0
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd0 /proc/partitions
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:39.751 1+0 records in
00:32:39.751 1+0 records out
00:32:39.751 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271703 s, 15.1 MB/s
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1
00:32:39.751 /dev/nbd1
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd1
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd1 /proc/partitions
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:39.751 1+0 records in
00:32:39.751 1+0 records out
00:32:39.751 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000256475 s, 16.0 MB/s
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:32:39.751 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10
00:32:40.011 /dev/nbd10
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd10
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 ))
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 ))
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd10 /proc/partitions
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 ))
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 ))
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct
00:32:40.011 1+0 records in
00:32:40.011 1+0 records out
00:32:40.011 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000310882 s, 13.2 MB/s
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']'
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ ))
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 ))
00:32:40.011 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11
00:32:40.271 /dev/nbd11
00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11
00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local nbd_name=nbd11 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@869 -- # local i 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i = 1 )) 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # (( i <= 20 )) 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # grep -q -w nbd11 /proc/partitions 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@873 -- # break 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i = 1 )) 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # (( i <= 20 )) 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:40.271 1+0 records in 00:32:40.271 1+0 records out 00:32:40.271 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262248 s, 15.6 MB/s 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # size=4096 00:32:40.271 13:41:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:40.271 13:41:21 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # '[' 4096 '!=' 0 ']' 00:32:40.271 13:41:21 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@889 -- # return 0 00:32:40.271 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 
00:32:40.271 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:40.271 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:40.271 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:40.271 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:40.531 { 00:32:40.531 "nbd_device": "/dev/nbd0", 00:32:40.531 "bdev_name": "crypto_ram" 00:32:40.531 }, 00:32:40.531 { 00:32:40.531 "nbd_device": "/dev/nbd1", 00:32:40.531 "bdev_name": "crypto_ram1" 00:32:40.531 }, 00:32:40.531 { 00:32:40.531 "nbd_device": "/dev/nbd10", 00:32:40.531 "bdev_name": "crypto_ram2" 00:32:40.531 }, 00:32:40.531 { 00:32:40.531 "nbd_device": "/dev/nbd11", 00:32:40.531 "bdev_name": "crypto_ram3" 00:32:40.531 } 00:32:40.531 ]' 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:40.531 { 00:32:40.531 "nbd_device": "/dev/nbd0", 00:32:40.531 "bdev_name": "crypto_ram" 00:32:40.531 }, 00:32:40.531 { 00:32:40.531 "nbd_device": "/dev/nbd1", 00:32:40.531 "bdev_name": "crypto_ram1" 00:32:40.531 }, 00:32:40.531 { 00:32:40.531 "nbd_device": "/dev/nbd10", 00:32:40.531 "bdev_name": "crypto_ram2" 00:32:40.531 }, 00:32:40.531 { 00:32:40.531 "nbd_device": "/dev/nbd11", 00:32:40.531 "bdev_name": "crypto_ram3" 00:32:40.531 } 00:32:40.531 ]' 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:40.531 /dev/nbd1 00:32:40.531 /dev/nbd10 00:32:40.531 /dev/nbd11' 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo 
'/dev/nbd0 00:32:40.531 /dev/nbd1 00:32:40.531 /dev/nbd10 00:32:40.531 /dev/nbd11' 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:40.531 256+0 records in 00:32:40.531 256+0 records out 00:32:40.531 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0119402 s, 87.8 MB/s 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:40.531 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:40.791 256+0 records in 00:32:40.791 256+0 records 
out 00:32:40.791 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0599476 s, 17.5 MB/s 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:40.791 256+0 records in 00:32:40.791 256+0 records out 00:32:40.791 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0519584 s, 20.2 MB/s 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:32:40.791 256+0 records in 00:32:40.791 256+0 records out 00:32:40.791 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.030434 s, 34.5 MB/s 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:32:40.791 256+0 records in 00:32:40.791 256+0 records out 00:32:40.791 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0285164 s, 36.8 MB/s 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local 
tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 
00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:40.791 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:41.051 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:41.051 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:41.051 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:41.051 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:41.051 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:41.051 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:41.051 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:41.051 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:41.051 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:41.051 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:41.311 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:41.311 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:41.311 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:41.311 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:41.311 13:41:21 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:41.311 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:41.311 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:41.311 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:41.311 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:41.311 13:41:21 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:32:41.571 13:41:22 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:41.571 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:41.831 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:41.831 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:41.831 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- 
# '[' 0 -ne 0 ']' 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:42.091 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:42.092 13:41:22 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:42.662 malloc_lvol_verify 00:32:42.662 13:41:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:43.230 3ddf28df-633f-4c2f-927e-de48b9cd7b86 00:32:43.230 13:41:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:43.800 73ba7d1c-dba1-4ab5-a2ce-fd31caba9952 00:32:43.800 13:41:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:44.370 /dev/nbd0 00:32:44.370 13:41:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:44.370 mke2fs 1.46.5 (30-Dec-2021) 00:32:44.370 Discarding device blocks: 0/4096 done 00:32:44.370 Creating filesystem with 4096 1k blocks and 1024 inodes 
00:32:44.370 00:32:44.370 Allocating group tables: 0/1 done 00:32:44.370 Writing inode tables: 0/1 done 00:32:44.370 Creating journal (1024 blocks): done 00:32:44.370 Writing superblocks and filesystem accounting information: 0/1 done 00:32:44.370 00:32:44.370 13:41:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:44.370 13:41:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:44.370 13:41:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:44.370 13:41:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:44.370 13:41:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:44.370 13:41:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:44.370 13:41:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:44.370 13:41:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:44.370 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:44.370 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:44.370 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:44.370 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:44.370 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:44.370 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:44.371 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:44.371 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:44.371 13:41:25 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:44.371 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:44.371 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@325 -- # killprocess 1107131 00:32:44.371 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@950 -- # '[' -z 1107131 ']' 00:32:44.371 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # kill -0 1107131 00:32:44.371 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # uname 00:32:44.371 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:32:44.371 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1107131 00:32:44.631 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:32:44.631 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:32:44.631 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1107131' 00:32:44.631 killing process with pid 1107131 00:32:44.631 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@969 -- # kill 1107131 00:32:44.631 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@974 -- # wait 1107131 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # trap - SIGINT SIGTERM EXIT 00:32:44.892 00:32:44.892 real 0m10.262s 00:32:44.892 user 0m15.007s 00:32:44.892 sys 0m2.559s 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1126 -- # xtrace_disable 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:44.892 ************************************ 00:32:44.892 END TEST bdev_nbd 00:32:44.892 ************************************ 00:32:44.892 13:41:25 blockdev_crypto_qat -- 
bdev/blockdev.sh@762 -- # [[ y == y ]] 00:32:44.892 13:41:25 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = nvme ']' 00:32:44.892 13:41:25 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # '[' crypto_qat = gpt ']' 00:32:44.892 13:41:25 blockdev_crypto_qat -- bdev/blockdev.sh@767 -- # run_test bdev_fio fio_test_suite '' 00:32:44.892 13:41:25 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 3 -le 1 ']' 00:32:44.892 13:41:25 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:32:44.892 13:41:25 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:44.892 ************************************ 00:32:44.892 START TEST bdev_fio 00:32:44.892 ************************************ 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1125 -- # fio_test_suite '' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@330 -- # local env_context 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@334 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:44.892 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # echo '' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # sed s/--env-context=// 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@338 -- # env_context= 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 
00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=verify 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type=AIO 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z verify ']' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' verify == verify ']' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1314 -- # cat 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # '[' AIO == AIO ']' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # /usr/src/fio/fio --version 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1325 -- # echo serialize_overlap=1 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram]' 00:32:44.892 13:41:25 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram1]' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram1 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram2]' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram2 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # for b in "${bdevs_name[@]}" 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # echo '[job_crypto_ram3]' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo filename=crypto_ram3 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@346 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@348 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1107 -- # 
xtrace_disable 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:44.892 ************************************ 00:32:44.892 START TEST bdev_fio_rw_verify 00:32:44.892 ************************************ 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local sanitizers 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1341 -- # shift 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1343 -- # local asan_lib= 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libasan 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:32:44.892 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:32:45.174 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # asan_lib= 00:32:45.174 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:32:45.174 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:45.174 13:41:25 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1352 -- # 
/usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:45.435 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:45.435 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:45.435 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:45.435 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:45.435 fio-3.35 00:32:45.435 Starting 4 threads 00:33:00.343 00:33:00.343 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1109700: Thu Jul 25 13:41:38 2024 00:33:00.343 read: IOPS=34.2k, BW=134MiB/s (140MB/s)(1335MiB/10001msec) 00:33:00.343 slat (usec): min=14, max=364, avg=38.18, stdev=25.46 00:33:00.343 clat (usec): min=19, max=1292, avg=229.78, stdev=160.99 00:33:00.343 lat (usec): min=34, max=1406, avg=267.96, stdev=174.90 00:33:00.343 clat percentiles (usec): 00:33:00.343 | 50.000th=[ 176], 99.000th=[ 783], 99.900th=[ 996], 99.990th=[ 1172], 00:33:00.343 | 99.999th=[ 1287] 00:33:00.343 write: IOPS=37.5k, BW=147MiB/s (154MB/s)(1428MiB/9749msec); 0 zone resets 00:33:00.343 slat (usec): min=15, max=720, avg=48.06, stdev=25.46 00:33:00.343 clat (usec): min=16, max=1676, avg=262.28, stdev=164.18 00:33:00.343 lat (usec): min=43, max=1769, avg=310.34, stdev=178.16 00:33:00.343 clat percentiles (usec): 00:33:00.343 | 50.000th=[ 221], 99.000th=[ 807], 99.900th=[ 1045], 99.990th=[ 1270], 00:33:00.343 | 99.999th=[ 1500] 00:33:00.343 bw ( KiB/s): min=130560, max=162888, per=97.43%, avg=146172.11, stdev=2519.47, 
samples=76 00:33:00.343 iops : min=32640, max=40722, avg=36543.00, stdev=629.87, samples=76 00:33:00.343 lat (usec) : 20=0.01%, 50=0.03%, 100=13.50%, 250=49.50%, 500=28.50% 00:33:00.343 lat (usec) : 750=6.95%, 1000=1.40% 00:33:00.343 lat (msec) : 2=0.13% 00:33:00.343 cpu : usr=99.70%, sys=0.00%, ctx=51, majf=0, minf=215 00:33:00.343 IO depths : 1=0.2%, 2=28.6%, 4=57.0%, 8=14.2%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:00.343 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:00.343 complete : 0=0.0%, 4=87.5%, 8=12.5%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:00.343 issued rwts: total=341817,365654,0,0 short=0,0,0,0 dropped=0,0,0,0 00:33:00.343 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:00.343 00:33:00.343 Run status group 0 (all jobs): 00:33:00.343 READ: bw=134MiB/s (140MB/s), 134MiB/s-134MiB/s (140MB/s-140MB/s), io=1335MiB (1400MB), run=10001-10001msec 00:33:00.343 WRITE: bw=147MiB/s (154MB/s), 147MiB/s-147MiB/s (154MB/s-154MB/s), io=1428MiB (1498MB), run=9749-9749msec 00:33:00.343 00:33:00.343 real 0m13.505s 00:33:00.343 user 0m49.945s 00:33:00.343 sys 0m0.504s 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:33:00.343 ************************************ 00:33:00.343 END TEST bdev_fio_rw_verify 00:33:00.343 ************************************ 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # rm -f 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@353 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 
-- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local workload=trim 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local bdev_type= 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local env_context= 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1284 -- # local fio_dir=/usr/src/fio 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1286 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:33:00.343 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1291 -- # '[' -z trim ']' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1295 -- # '[' -n '' ']' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1299 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1301 -- # cat 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # '[' trim == verify ']' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # '[' trim == trim ']' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1329 -- # echo rw=trimwrite 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "bcefee04-e466-5382-9c50-75b4e9f3b9eb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bcefee04-e466-5382-9c50-75b4e9f3b9eb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 
0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "9f1c373a-0c00-55e4-993c-dbdc2c95d2fb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "9f1c373a-0c00-55e4-993c-dbdc2c95d2fb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": 
"crypto_ram2",' ' "aliases": [' ' "a7f7f865-fe35-5ded-830d-1ed3ff594dc0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a7f7f865-fe35-5ded-830d-1ed3ff594dc0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "0c5f90a4-e0b0-5eb3-ba90-d92209d876cf"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "0c5f90a4-e0b0-5eb3-ba90-d92209d876cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' 
"memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # [[ -n crypto_ram 00:33:00.344 crypto_ram1 00:33:00.344 crypto_ram2 00:33:00.344 crypto_ram3 ]] 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "bcefee04-e466-5382-9c50-75b4e9f3b9eb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "bcefee04-e466-5382-9c50-75b4e9f3b9eb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "9f1c373a-0c00-55e4-993c-dbdc2c95d2fb"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": 
"9f1c373a-0c00-55e4-993c-dbdc2c95d2fb",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "a7f7f865-fe35-5ded-830d-1ed3ff594dc0"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "a7f7f865-fe35-5ded-830d-1ed3ff594dc0",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' 
"name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "0c5f90a4-e0b0-5eb3-ba90-d92209d876cf"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "0c5f90a4-e0b0-5eb3-ba90-d92209d876cf",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "flush": true,' ' "reset": true,' ' "nvme_admin": false,' ' "nvme_io": false,' ' "nvme_io_md": false,' ' "write_zeroes": true,' ' "zcopy": false,' ' "get_zone_info": false,' ' "zone_management": false,' ' "zone_append": false,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "seek_hole": false,' ' "seek_data": false,' ' "copy": false,' ' "nvme_iov_md": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram]' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram1]' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram1 00:33:00.344 
13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram2]' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram2 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # echo '[job_crypto_ram3]' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo filename=crypto_ram3 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@366 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1101 -- # '[' 11 -le 1 ']' 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:00.344 ************************************ 00:33:00.344 START TEST bdev_fio_trim 00:33:00.344 ************************************ 00:33:00.344 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 
--verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1356 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1337 -- # local fio_dir=/usr/src/fio 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local sanitizers 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1341 -- # shift 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # local asan_lib= 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libasan 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:00.345 13:41:39 
blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # for sanitizer in "${sanitizers[@]}" 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # grep libclang_rt.asan 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # awk '{print $3}' 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # asan_lib= 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1346 -- # [[ -n '' ]] 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:33:00.345 13:41:39 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1352 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:33:00.345 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:00.345 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:00.345 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 
4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:00.345 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:33:00.345 fio-3.35 00:33:00.345 Starting 4 threads 00:33:12.566 00:33:12.566 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=1112126: Thu Jul 25 13:41:52 2024 00:33:12.566 write: IOPS=60.2k, BW=235MiB/s (247MB/s)(2352MiB/10001msec); 0 zone resets 00:33:12.566 slat (usec): min=14, max=547, avg=41.04, stdev=25.81 00:33:12.566 clat (usec): min=29, max=1323, avg=140.19, stdev=85.02 00:33:12.566 lat (usec): min=45, max=1494, avg=181.23, stdev=99.45 00:33:12.566 clat percentiles (usec): 00:33:12.566 | 50.000th=[ 124], 99.000th=[ 429], 99.900th=[ 750], 99.990th=[ 840], 00:33:12.566 | 99.999th=[ 1037] 00:33:12.566 bw ( KiB/s): min=151872, max=263424, per=99.64%, avg=240008.21, stdev=9446.62, samples=76 00:33:12.566 iops : min=37968, max=65856, avg=60002.05, stdev=2361.66, samples=76 00:33:12.566 trim: IOPS=60.2k, BW=235MiB/s (247MB/s)(2352MiB/10001msec); 0 zone resets 00:33:12.566 slat (usec): min=3, max=185, avg= 8.38, stdev= 4.51 00:33:12.566 clat (usec): min=45, max=1494, avg=181.39, stdev=99.46 00:33:12.566 lat (usec): min=51, max=1516, avg=189.78, stdev=100.73 00:33:12.566 clat percentiles (usec): 00:33:12.566 | 50.000th=[ 159], 99.000th=[ 537], 99.900th=[ 914], 99.990th=[ 1012], 00:33:12.567 | 99.999th=[ 1287] 00:33:12.567 bw ( KiB/s): min=151872, max=263424, per=99.64%, avg=240008.21, stdev=9446.62, samples=76 00:33:12.567 iops : min=37968, max=65856, avg=60002.05, stdev=2361.66, samples=76 00:33:12.567 lat (usec) : 50=4.52%, 100=22.87%, 250=58.29%, 500=13.39%, 750=0.75% 00:33:12.567 lat (usec) : 1000=0.17% 00:33:12.567 lat (msec) : 2=0.01% 00:33:12.567 cpu : usr=99.69%, sys=0.00%, ctx=42, majf=0, minf=90 00:33:12.567 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:33:12.567 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 
00:33:12.567 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:33:12.567 issued rwts: total=0,602236,602236,0 short=0,0,0,0 dropped=0,0,0,0 00:33:12.567 latency : target=0, window=0, percentile=100.00%, depth=8 00:33:12.567 00:33:12.567 Run status group 0 (all jobs): 00:33:12.567 WRITE: bw=235MiB/s (247MB/s), 235MiB/s-235MiB/s (247MB/s-247MB/s), io=2352MiB (2467MB), run=10001-10001msec 00:33:12.567 TRIM: bw=235MiB/s (247MB/s), 235MiB/s-235MiB/s (247MB/s-247MB/s), io=2352MiB (2467MB), run=10001-10001msec 00:33:12.567 00:33:12.567 real 0m13.464s 00:33:12.567 user 0m49.518s 00:33:12.567 sys 0m0.492s 00:33:12.567 13:41:52 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:12.567 13:41:52 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:33:12.567 ************************************ 00:33:12.567 END TEST bdev_fio_trim 00:33:12.567 ************************************ 00:33:12.567 13:41:52 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # rm -f 00:33:12.567 13:41:52 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:33:12.567 13:41:52 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # popd 00:33:12.567 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:33:12.567 13:41:52 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # trap - SIGINT SIGTERM EXIT 00:33:12.567 00:33:12.567 real 0m27.324s 00:33:12.567 user 1m39.659s 00:33:12.567 sys 0m1.172s 00:33:12.567 13:41:52 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:12.567 13:41:52 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:33:12.567 ************************************ 00:33:12.567 END TEST bdev_fio 00:33:12.567 ************************************ 00:33:12.567 13:41:52 blockdev_crypto_qat -- bdev/blockdev.sh@774 -- # trap cleanup 
SIGINT SIGTERM EXIT 00:33:12.567 13:41:52 blockdev_crypto_qat -- bdev/blockdev.sh@776 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:12.567 13:41:52 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:33:12.567 13:41:52 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:12.567 13:41:52 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:12.567 ************************************ 00:33:12.567 START TEST bdev_verify 00:33:12.567 ************************************ 00:33:12.567 13:41:52 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:33:12.567 [2024-07-25 13:41:52.981086] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:33:12.567 [2024-07-25 13:41:52.981140] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1113749 ] 00:33:12.567 [2024-07-25 13:41:53.071001] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:12.567 [2024-07-25 13:41:53.166657] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:12.567 [2024-07-25 13:41:53.166718] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:12.567 [2024-07-25 13:41:53.187965] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:12.567 [2024-07-25 13:41:53.195990] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:12.567 [2024-07-25 13:41:53.204015] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:12.567 [2024-07-25 13:41:53.302656] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:15.113 [2024-07-25 13:41:55.561982] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:15.113 [2024-07-25 13:41:55.562071] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:15.113 [2024-07-25 13:41:55.562081] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.113 [2024-07-25 13:41:55.569997] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:15.113 [2024-07-25 13:41:55.570011] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:15.113 [2024-07-25 13:41:55.570017] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.113 
[2024-07-25 13:41:55.578019] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:15.113 [2024-07-25 13:41:55.578031] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:15.113 [2024-07-25 13:41:55.578037] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.113 [2024-07-25 13:41:55.586040] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:15.113 [2024-07-25 13:41:55.586053] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:15.113 [2024-07-25 13:41:55.586059] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:15.113 Running I/O for 5 seconds... 00:33:20.478 00:33:20.479 Latency(us) 00:33:20.479 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:20.479 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:20.479 Verification LBA range: start 0x0 length 0x1000 00:33:20.479 crypto_ram : 5.06 601.12 2.35 0.00 0.00 212238.22 2697.06 130668.70 00:33:20.479 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:20.479 Verification LBA range: start 0x1000 length 0x1000 00:33:20.479 crypto_ram : 5.05 406.76 1.59 0.00 0.00 311025.05 230.01 174224.94 00:33:20.479 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:20.479 Verification LBA range: start 0x0 length 0x1000 00:33:20.479 crypto_ram1 : 5.06 602.57 2.35 0.00 0.00 211243.71 2999.53 120989.54 00:33:20.479 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:20.479 Verification LBA range: start 0x1000 length 0x1000 00:33:20.479 crypto_ram1 : 5.05 405.36 1.58 0.00 0.00 314996.10 10334.52 191163.47 00:33:20.479 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:20.479 
Verification LBA range: start 0x0 length 0x1000 00:33:20.479 crypto_ram2 : 5.05 4692.75 18.33 0.00 0.00 27060.02 4285.05 24399.56 00:33:20.479 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:20.479 Verification LBA range: start 0x1000 length 0x1000 00:33:20.479 crypto_ram2 : 5.04 3147.51 12.29 0.00 0.00 40390.98 6856.07 41136.44 00:33:20.479 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:20.479 Verification LBA range: start 0x0 length 0x1000 00:33:20.479 crypto_ram3 : 5.05 4691.57 18.33 0.00 0.00 27013.87 4133.81 24399.56 00:33:20.479 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:20.479 Verification LBA range: start 0x1000 length 0x1000 00:33:20.479 crypto_ram3 : 5.04 3146.20 12.29 0.00 0.00 40287.75 6604.01 32263.88 00:33:20.479 =================================================================================================================== 00:33:20.479 Total : 17693.83 69.12 0.00 0.00 57501.54 230.01 191163.47 00:33:20.479 00:33:20.479 real 0m8.080s 00:33:20.479 user 0m15.425s 00:33:20.479 sys 0m0.342s 00:33:20.479 13:42:00 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:20.479 13:42:00 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:20.479 ************************************ 00:33:20.479 END TEST bdev_verify 00:33:20.479 ************************************ 00:33:20.479 13:42:01 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:20.479 13:42:01 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 16 -le 1 ']' 00:33:20.479 13:42:01 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:20.479 13:42:01 blockdev_crypto_qat 
-- common/autotest_common.sh@10 -- # set +x 00:33:20.479 ************************************ 00:33:20.479 START TEST bdev_verify_big_io 00:33:20.479 ************************************ 00:33:20.479 13:42:01 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:20.479 [2024-07-25 13:42:01.128648] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:33:20.479 [2024-07-25 13:42:01.128696] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1115106 ] 00:33:20.479 [2024-07-25 13:42:01.220351] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:20.739 [2024-07-25 13:42:01.297467] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:33:20.739 [2024-07-25 13:42:01.297471] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:20.739 [2024-07-25 13:42:01.318619] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:20.739 [2024-07-25 13:42:01.326647] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:20.739 [2024-07-25 13:42:01.334668] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:20.739 [2024-07-25 13:42:01.423313] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:23.280 [2024-07-25 13:42:03.585639] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:23.280 [2024-07-25 13:42:03.585699] bdev.c:8190:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc0 00:33:23.280 [2024-07-25 13:42:03.585707] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:23.280 [2024-07-25 13:42:03.593657] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:23.280 [2024-07-25 13:42:03.593667] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:23.280 [2024-07-25 13:42:03.593673] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:23.280 [2024-07-25 13:42:03.601678] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:23.280 [2024-07-25 13:42:03.601688] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:23.280 [2024-07-25 13:42:03.601693] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:23.280 [2024-07-25 13:42:03.609698] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:23.280 [2024-07-25 13:42:03.609707] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:23.280 [2024-07-25 13:42:03.609713] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:23.280 Running I/O for 5 seconds... 00:33:23.853 [2024-07-25 13:42:04.478260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.853 [2024-07-25 13:42:04.478752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.853 [2024-07-25 13:42:04.478819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:23.855 [2024-07-25 13:42:04.568973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.569501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.569552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.569597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.569642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.570148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.570160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.572639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.572687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.572731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.572775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.573144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.573190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:23.855 [2024-07-25 13:42:04.573235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.855 [2024-07-25 13:42:04.573291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.573866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.573879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.576676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.576725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.576770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.576814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.577253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.577300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.577344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.577388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.577723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:23.856 [2024-07-25 13:42:04.577735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.581004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.581056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.581100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.581145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.581524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.581575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.581620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.581664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.582154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.582166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.584965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.585013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:23.856 [2024-07-25 13:42:04.585058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.585103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.585606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.585653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.585697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.585741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.586151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.586163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.588503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.588556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.588601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.588645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.589194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:23.856 [2024-07-25 13:42:04.589243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.589288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.589342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.589939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.589951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.592399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.592447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.592510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.592559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.592929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.592975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.593020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.593067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:23.856 [2024-07-25 13:42:04.593590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.593602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.596274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.596322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.596366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.596410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.596921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.596968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.597012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.597056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.597392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.597404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.600239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:23.856 [2024-07-25 13:42:04.600286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.600333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.600377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.600751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.600799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.600842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.600887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.601218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.601230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.603782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.603829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.603878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.603923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:23.856 [2024-07-25 13:42:04.604526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.604579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.604624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.604669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.605022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.605034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.607218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.607265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.607309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.607341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.607864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.607911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.607955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:23.856 [2024-07-25 13:42:04.608000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.608508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.856 [2024-07-25 13:42:04.608520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.613149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.614883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.616712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.618535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.619453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.619919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.621063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.622704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.623038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.623050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:23.857 [2024-07-25 13:42:04.625742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.626205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.627450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.629091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.631251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.632468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.634222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.636148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.636484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.636496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:23.857 [2024-07-25 13:42:04.641486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.643405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.645266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.119 [2024-07-25 13:42:04.646488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.648683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.650513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.651859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.652329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.652844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.652857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.656767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.658587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.660412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.661412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.662403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.663166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.119 [2024-07-25 13:42:04.664810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.666654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.666988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.666999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.669856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.670319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.672252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.673972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.676223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.677485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.679129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.680955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.681288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.119 [2024-07-25 13:42:04.681299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.685936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.687754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.689259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.691152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.693371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.695221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.695927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.696389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.696962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.696975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.701208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.119 [2024-07-25 13:42:04.703164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.119 [2024-07-25 13:42:04.705125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.119-00:33:24.386 [previous message repeated for subsequent allocation attempts, 2024-07-25 13:42:04.705589 through 13:42:05.015558]
00:33:24.386 [2024-07-25 13:42:05.015602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.386 [2024-07-25 13:42:05.015645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.386 [2024-07-25 13:42:05.016053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.386 [2024-07-25 13:42:05.016104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.386 [2024-07-25 13:42:05.016149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.386 [2024-07-25 13:42:05.016193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.386 [2024-07-25 13:42:05.016237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.386 [2024-07-25 13:42:05.016571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.386 [2024-07-25 13:42:05.016584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.019330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.019378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.019422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.019466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.387 [2024-07-25 13:42:05.019801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.019860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.019905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.019951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.019995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.020388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.020400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.022512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.022564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.022609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.022654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.023213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.023263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.387 [2024-07-25 13:42:05.023308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.023352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.023399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.023781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.023795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.025903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.025952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.025996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.026040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.026370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.026423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.026468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.026513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.387 [2024-07-25 13:42:05.026561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.027101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.027114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.029468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.029517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.029566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.029615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.030080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.030129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.030175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.030219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.030262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.030628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.387 [2024-07-25 13:42:05.030640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.033036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.033085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.033129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.033173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.033555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.033605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.033651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.033695] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.033740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.034070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.034082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.036006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.387 [2024-07-25 13:42:05.036055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.036103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.036147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.036693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.036742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.036787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.036831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.036875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.037405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.037418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.039565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.039613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.039657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.387 [2024-07-25 13:42:05.039701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.040073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.040127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.040179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.040224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.040268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.040602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.040615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.043480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.043528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.043579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.043625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.043959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.387 [2024-07-25 13:42:05.044008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.044052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.044096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.044142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.044494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.044507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.046595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.046644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.387 [2024-07-25 13:42:05.046689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.046733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.047251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.047301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.047347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.388 [2024-07-25 13:42:05.047392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.047437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.047861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.047874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.049995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.050044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.050092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.050136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.050466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.050515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.050564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.050610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.050655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.388 [2024-07-25 13:42:05.051213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.051225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.053514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.053567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.053613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.053658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.054053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.054107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.054155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.054199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.054242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.054600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.054613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.388 [2024-07-25 13:42:05.057014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.057062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.057106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.057150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.057542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.057598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.057642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.057690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.057741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.058071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.058083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.060014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.060064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.388 [2024-07-25 13:42:05.060108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.060152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.060708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.060761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.060806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.060850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.060894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.061384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.061396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.063614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.063662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.063710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.063757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.388 [2024-07-25 13:42:05.064091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.064140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.064184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.064228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.064272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.064828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.064841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.067405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.067453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.067498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.067543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.068006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.388 [2024-07-25 13:42:05.068061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.388 [2024-07-25 13:42:05.068106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.068149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.068194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.068686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.068701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.071414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.071463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.071510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.071559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.072085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.072136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.072181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.072225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.072272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.072867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.072879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.075557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.075605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.075650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.075694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.076206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.076256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.388 [2024-07-25 13:42:05.076300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.076344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.076388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.076846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.076858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.079531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.079583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.079636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.079681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.080171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.080221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.080266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.080310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.080354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.080844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.080857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.083715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.083763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.083808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.083852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.084404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.084458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.084503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.084552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.084597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.085103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.085116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.087959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.088007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.088059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.088110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.088643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.088693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.088738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.088782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.088826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.089310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.089326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.092043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.092091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.092135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.092180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.092686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.092737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.092782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.092837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.092891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.093401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.093413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.096103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.096152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.096200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.096245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.096814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.096864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.096909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.096953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.096998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.097553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.097566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.100323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.100372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.100416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.100461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.100973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.101037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.101082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.101127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.101175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.101702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.101715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.104469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.104522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.104570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.104615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.105142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.105192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.105236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.105281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.105334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.105893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.105905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.108497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.108550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.108595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.108639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.109125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.109177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.109222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.109279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.109335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.109887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.389 [2024-07-25 13:42:05.109900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.112529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.112581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.112632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.112677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.113141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.113194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.113239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.113283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.113329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.113841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.113853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.116563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.116612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.116661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.116714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.117263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.117319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.117364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.117408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.117460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.118036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.118049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.120717] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.120769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.120813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.120859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.121399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.121449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.121495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.121540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.121589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.122054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.122066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.124752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.124800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.125261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.125312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.125844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.125897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.125942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.125986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.126039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.126630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.126642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.129292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.129341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.129385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.129850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.130323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.130373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.130417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.130462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.130507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.131004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.131017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.134205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.134674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.135135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.135599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.136185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.136655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.137117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.137584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.138045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.138536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.138552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.141632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.142097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.142563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.143036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.143474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.144720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.145832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.390 [2024-07-25 13:42:05.146293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.146756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.147212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.147225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.151497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.153324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.155158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.157006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.157407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.159057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.160880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.162714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.163732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.164282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.164295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.168244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.169987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.391 [2024-07-25 13:42:05.171629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.173457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.173800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.174349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.174817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.175280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.176938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.177381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.177393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.180881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.181345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.181811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.182750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.183174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.185020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.186852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.188361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.190256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.190645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.190658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.194128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.195775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.197595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.199415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.199883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.201760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.203423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.205248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.207072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.207565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.207578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.211874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.213826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.215131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.216780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.217113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.218950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.219938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.220404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.220871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.221307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.221319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.225232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.227067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.227830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.228292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.228880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.229989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.231639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.233474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.235250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.235733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.235745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.652 [2024-07-25 13:42:05.238304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.239083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.240729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.242564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.242901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.244428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.246321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.247667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.249501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.249844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.249857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.254395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.256208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.258138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.259400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.653 [2024-07-25 13:42:05.259791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.261627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.263446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.264631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.265093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.265642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.265654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.269726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.271601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.273420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.275243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.275733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.276201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.653 [2024-07-25 13:42:05.276667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.278082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.279640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.279975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.279986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.282363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.282836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.283336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.285177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.285513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.287229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.289007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.290622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.653 [2024-07-25 13:42:05.292268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.292607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.292620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.296663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.298486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.300333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.302132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.302516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.304171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.305998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.307822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.308795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.309324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.653 [2024-07-25 13:42:05.309336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.313324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.315018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.316667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.318500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.318840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.319362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.319829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.320293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.322238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.322642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.322655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.326118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.653 [2024-07-25 13:42:05.326589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.327052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.327873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.328263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.330113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.331947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.333616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.335314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.335666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.335683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.339253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.340902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.342669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.653 [2024-07-25 13:42:05.344492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.344953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.346902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.348652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.350471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.352319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.352782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.352794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.357092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.359037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.360360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.361995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.653 [2024-07-25 13:42:05.362328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.653 [2024-07-25 13:42:05.364174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.365165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.365633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.366094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.366495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.366508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.370428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.372264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.372807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.373268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.373842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.375081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.376719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.654 [2024-07-25 13:42:05.378536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.380358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.380828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.380842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.383383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.384262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.385897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.387715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.388053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.389462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.391418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.393159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.394989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.654 [2024-07-25 13:42:05.395324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.395337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.399845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.401817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.403715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.404951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.405322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.407154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.408972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.410274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.410746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.411296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.411308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.654 [2024-07-25 13:42:05.415110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.417061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.418978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.420830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.421313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.421787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.422248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.423595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.425226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.425565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.425578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.427960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.428423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.654 [2024-07-25 13:42:05.429076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.430786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.431121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.432945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.434602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.436336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.437973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.438305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.654 [2024-07-25 13:42:05.438317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.442562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.444503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.446452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.448296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.917 [2024-07-25 13:42:05.448790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.450445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.452274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.454118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.455917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.456464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.456476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.460579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.461809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.463450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.465308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.465647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.467286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.917 [2024-07-25 13:42:05.467752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.468214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.468937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.469303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.469314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.473012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.473650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.474110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.474573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.474922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.476719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.478559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.917 [2024-07-25 13:42:05.480381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.917 [2024-07-25 13:42:05.481602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:24.920 [message above repeated for each subsequent allocation attempt, timestamps 13:42:05.481991 through 13:42:05.612898]
00:33:24.920 [2024-07-25 13:42:05.612943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.612987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.613031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.613358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.613370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.615799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.615846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.615890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.615934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.616295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.616344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.616388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.616441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.920 [2024-07-25 13:42:05.616485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.616816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.616828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.619004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.619055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.619104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.619149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.920 [2024-07-25 13:42:05.619674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.619723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.619777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.619822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.619866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.620406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.921 [2024-07-25 13:42:05.620418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.622359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.622414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.622459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.622503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.622836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.622889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.622934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.622978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.623023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.623416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.623428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.626080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.921 [2024-07-25 13:42:05.626128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.626171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.626215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.626545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.626598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.626643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.626686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.626730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.627193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.627205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.629419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.629466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.629510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.921 [2024-07-25 13:42:05.629568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.630148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.630197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.630242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.630286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.630330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.630701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.630713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.632762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.632811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.632859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.632903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.633237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.921 [2024-07-25 13:42:05.633286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.633330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.633374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.633429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.633997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.634009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.636311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.636359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.636403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.636447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.636931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.636984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.637029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.921 [2024-07-25 13:42:05.637077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.637121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.637508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.637520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.639858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.639907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.639953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.639999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.640379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.640427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.640471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.640516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.640564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.921 [2024-07-25 13:42:05.640896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.640908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.642809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.642857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.642901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.642945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.643495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.643545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.643593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.643637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.643681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.644164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.921 [2024-07-25 13:42:05.644176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.921 [2024-07-25 13:42:05.646333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.646381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.646429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.646472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.646872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.646921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.646966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.647010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.647058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.647389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.647401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.650025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.650072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.922 [2024-07-25 13:42:05.650116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.650161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.650494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.650550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.650596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.650640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.650684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.651015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.651027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.653148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.653196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.653242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.653287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.922 [2024-07-25 13:42:05.653737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.653787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.653831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.653876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.653920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.654422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.654435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.656438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.656485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.658306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.658353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.658686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.658737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.922 [2024-07-25 13:42:05.658782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.658826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.658870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.659410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.659422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.661769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.661817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.661861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.663681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.664066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.664116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.664162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.664206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.922 [2024-07-25 13:42:05.664250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.664584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.664596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.669060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.670760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.672584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.674418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.674871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.676518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.678404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.680350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.682307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:24.922 [2024-07-25 13:42:05.682818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:24.922 [2024-07-25 13:42:05.682831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [previous message repeated many times, last occurrence 2024-07-25 13:42:05.976775]
00:33:25.450 [2024-07-25 13:42:05.978597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.978933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.978945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.983488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.985377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.987214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.988446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.988805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.990635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.992460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.993808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.994279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.994769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.450 [2024-07-25 13:42:05.994782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:05.998831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.000530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.002352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.004187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.004662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.005126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.005594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.006776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.008406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.008745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.008757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.011188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.450 [2024-07-25 13:42:06.011656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.450 [2024-07-25 13:42:06.012242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.013784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.014117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.015953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.017552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.019344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.021000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.021334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.021346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.025680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.027652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.029515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.451 [2024-07-25 13:42:06.031357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.031866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.033524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.035331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.037193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.039148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.039707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.039720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.043889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.045121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.045168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.046809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.047143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.451 [2024-07-25 13:42:06.048978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.050591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.051052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.051512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.052042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.052055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.055809] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.057659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.059612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.059659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.060175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.060643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.061114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.451 [2024-07-25 13:42:06.062828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.064481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.064828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.064841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.066952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.067002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.067047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.067091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.067634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.067684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.067729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.067774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.067831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.451 [2024-07-25 13:42:06.068419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.068431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.070512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.070563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.070609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.070653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.070984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.071033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.071078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.071122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.071166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.071495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.071507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.451 [2024-07-25 13:42:06.074095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.074143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.074187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.074235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.074569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.074622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.074667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.074713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.074757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.075239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.075251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.077462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.077510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.451 [2024-07-25 13:42:06.077559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.077603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.078126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.078175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.078221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.078265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.078309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.078693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.078706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.080737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.080785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.080828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.451 [2024-07-25 13:42:06.080872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.452 [2024-07-25 13:42:06.081202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.081255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.081299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.081343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.081387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.081910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.081923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.084228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.084280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.084324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.084368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.084813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.084864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.452 [2024-07-25 13:42:06.084908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.084952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.084996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.085405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.085417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.087846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.087894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.087938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.087982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.088324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.088374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.088427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.088471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.452 [2024-07-25 13:42:06.088515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.088849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.088861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.090819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.090867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.090915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.090969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.091555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.091605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.091650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.091694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.091738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.092262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.452 [2024-07-25 13:42:06.092275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.094433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.094480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.094524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.094572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.094906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.094959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.095004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.095047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.095091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.095421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.095433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.452 [2024-07-25 13:42:06.098092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.452 [2024-07-25 13:42:06.098140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [... identical *ERROR* line repeated continuously from 13:42:06.098184 through 13:42:06.191744; duplicates omitted ...] 
00:33:25.455 [2024-07-25 13:42:06.191757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.195097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.195567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.196028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.196488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.197041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.197507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.197973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.198432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.198896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.199374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.199386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.202540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.455 [2024-07-25 13:42:06.203021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.203483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.203946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.204466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.204934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.205395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.205862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.206323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.206870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.206883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.210061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.210528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.455 [2024-07-25 13:42:06.211002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.455 [2024-07-25 13:42:06.211462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.211973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.212439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.212904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.213364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.213827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.214328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.214340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.217438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.217906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.218367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.218831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.219396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.456 [2024-07-25 13:42:06.219871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.220333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.220796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.221267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.221813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.221826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.224988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.225454] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.225918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.226380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.226899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.227376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.227841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.456 [2024-07-25 13:42:06.228302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.228767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.229316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.229328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.232366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.232848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.233310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.233776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.234139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.234946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.236678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.237139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.456 [2024-07-25 13:42:06.237607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.718 [2024-07-25 13:42:06.238118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.238131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.241302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.241770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.242232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.242832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.243183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.245019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.246848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.248782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.250224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.250638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.250651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.718 [2024-07-25 13:42:06.254202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.255846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.257682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.259511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.259988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.261897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.263562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.265380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.267206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.267732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.267744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.271754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.273707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.718 [2024-07-25 13:42:06.275020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.276662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.276996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.278829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.279831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.280291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.280754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.281155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.281167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.285182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.287082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.287544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.288007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.718 [2024-07-25 13:42:06.288521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.290120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.291767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.293593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.295397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.295893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.295905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.298649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.299816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.301449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.303269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.303608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.304850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.718 [2024-07-25 13:42:06.306675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.308636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.310538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.310876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.310888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.315319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.317144] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.318614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.320436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.320806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.322635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.324449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.325201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.718 [2024-07-25 13:42:06.325664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.326256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.326268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.329955] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.331785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.333621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.334445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.718 [2024-07-25 13:42:06.334976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.335443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.336529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.338166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.340007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.340344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.719 [2024-07-25 13:42:06.340356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.343070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.343541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.345226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.346862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.347196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.349042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.350256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.351892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.353733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.354066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.354078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.358556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.719 [2024-07-25 13:42:06.360387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.362217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.363442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.363853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.365675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.367502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.368946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.369407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.369923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.369935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.374072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.375940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.719 [2024-07-25 13:42:06.377755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.719 [2024-07-25 13:42:06.379575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:25.984 [2024-07-25 13:42:06.628318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.984 [2024-07-25 13:42:06.628362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.984 [2024-07-25 13:42:06.628406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.984 [2024-07-25 13:42:06.628899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.984 [2024-07-25 13:42:06.628953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.984 [2024-07-25 13:42:06.629002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.629054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.629099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.629667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.629680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.632430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.632478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.632526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.985 [2024-07-25 13:42:06.632578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.633126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.633175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.633220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.633264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.633310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.633870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.633883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.636631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.636678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.636722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.636766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.637229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.985 [2024-07-25 13:42:06.637284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.637329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.637375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.637419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.637902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.637915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.640560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.640611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.640655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.640699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.641181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.641230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.641274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.985 [2024-07-25 13:42:06.641318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.641363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.641868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.641881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.644671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.644727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.644779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.644823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.645292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.645342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.645387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.645442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.645486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.985 [2024-07-25 13:42:06.646040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.646053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.648771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.648821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.648870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.648924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.649520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.649576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.649624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.649668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.649712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.650227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.650239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.985 [2024-07-25 13:42:06.653871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.653927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.653971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.654015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.654500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.654558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.654604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.654649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.654694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.655247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.655260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.658962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.659013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.985 [2024-07-25 13:42:06.659057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.659101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.659583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.659645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.659702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.659747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.659791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.660302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.660314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.664062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.664113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.664159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.664203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:25.985 [2024-07-25 13:42:06.664658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.664708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.664753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.664796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.664840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.665424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.985 [2024-07-25 13:42:06.665443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.986 [2024-07-25 13:42:06.670500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.986 [2024-07-25 13:42:06.670555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.986 [2024-07-25 13:42:06.671201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.986 [2024-07-25 13:42:06.671248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.986 [2024-07-25 13:42:06.671293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:25.986 [2024-07-25 13:42:06.671786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.249 [2024-07-25 13:42:06.894865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.904795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.904860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.905293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.905345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.906665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.906718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.908327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.908668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.908680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.908691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.919112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.920953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.249 [2024-07-25 13:42:06.922767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.923209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.923221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.926049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.927396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.929031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.930858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.932422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.934076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.935969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.937940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.938298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.938310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.249 [2024-07-25 13:42:06.942652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.944329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.946162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.947984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.950307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.952272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.954141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.955960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.956461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.956474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.960610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.962365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.964014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.249 [2024-07-25 13:42:06.965666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.967818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.968960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.970486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.970949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.971290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.971302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.975100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.976847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.978103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.978567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.979660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.981418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.249 [2024-07-25 13:42:06.983368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.985346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.985692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.985704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.988107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.989655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.990468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.991576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.993746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.995570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.997058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.998959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.249 [2024-07-25 13:42:06.999353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.249 [2024-07-25 13:42:06.999365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.515 [the same *ERROR* line repeated continuously from 13:42:06.999365 through 13:42:07.223137] 
00:33:26.515 [2024-07-25 13:42:07.223181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.223225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.223818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.223865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.223909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.223953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.224291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.224305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.226278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.226325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.226369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.226413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.226787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.515 [2024-07-25 13:42:07.226836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.226880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.226934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.227267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.227279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.229646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.229694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.229738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.229786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.230160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.230207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.230250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.230294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.515 [2024-07-25 13:42:07.230787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.230799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.232871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.232920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.232964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.233008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.233541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.233592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.233636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.233680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.234213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.234226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.236334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.515 [2024-07-25 13:42:07.236382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.236426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.236470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.236907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.236954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.236999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.237043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.237373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.237385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.239865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.239913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.239958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.240002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.515 [2024-07-25 13:42:07.240408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.240455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.240500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.240544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.240880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.515 [2024-07-25 13:42:07.240892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.242840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.242888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.244538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.244589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.245165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.245213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.245257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.516 [2024-07-25 13:42:07.245301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.245690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.245702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.249294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.249344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.250977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.251023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.251397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.253226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.253272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.254427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.254937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.254949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.516 [2024-07-25 13:42:07.259118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.259166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.260993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.261040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.261536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.263186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.263232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.265045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.265380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.265392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.268833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.268882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.270522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.516 [2024-07-25 13:42:07.270572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.270946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.272855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.272902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.274304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.274706] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.274718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.278795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.278844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.279304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.279349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.279759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.281719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.516 [2024-07-25 13:42:07.281766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.283639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.284030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.284041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.286406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.286455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.288317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.288363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.288979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.290204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.290251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.291892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.292227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.516 [2024-07-25 13:42:07.292239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.295723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.295771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.296251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.296297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.296728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.297292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.297339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.298879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.299266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.516 [2024-07-25 13:42:07.299278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.303134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.303184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.779 [2024-07-25 13:42:07.304743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.304790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.305349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.306271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.306318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.307446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.307973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.307985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.311761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.311818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.313637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.313684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.314217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.779 [2024-07-25 13:42:07.316160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.316212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.316674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.317021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.317032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.320554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.320603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.322242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.322288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.322667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.324627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.324675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.779 [2024-07-25 13:42:07.326138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.780 [2024-07-25 13:42:07.326645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.326657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.330831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.330879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.332683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.332729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.333250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.334896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.334942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.336766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.337103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.337115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:26.780 [2024-07-25 13:42:07.340745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:26.780 [2024-07-25 13:42:07.340793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.572954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.046 [2024-07-25 13:42:07.576805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.576853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.577478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.577523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.577952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.579092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.579139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.579602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.579652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.580186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.580198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.584423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.584472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.046 [2024-07-25 13:42:07.586282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.586328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.586665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.588630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.588678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.590270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.590317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.590713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.590726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.594670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.594719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.595179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.595225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.046 [2024-07-25 13:42:07.595584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.597412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.597459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.599282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.599329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.599783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.599796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.602273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.602330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.604280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.604326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.604896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.606170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.046 [2024-07-25 13:42:07.606217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.607867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.607913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.608246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.608259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.611948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.611996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.612455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.612500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.612874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.613444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.613491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.615006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.046 [2024-07-25 13:42:07.615052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.615426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.615438] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.619084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.619133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.620917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.620963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.621498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.622434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.622481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.623607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.623653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.624147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.046 [2024-07-25 13:42:07.624159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.627987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.628036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.629861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.046 [2024-07-25 13:42:07.629912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.630398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.632348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.632396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.632860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.632908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.633244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.633256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.636824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.047 [2024-07-25 13:42:07.636872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.638511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.638562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.638895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.640859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.640907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.642261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.642308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.642835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.642849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.646898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.646947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.648860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.047 [2024-07-25 13:42:07.648907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.649348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.650995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.651042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.652872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.652918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.653250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.653263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.657004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.657053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.658707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.658753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.659088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.047 [2024-07-25 13:42:07.661059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.661107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.662550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.662596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.662958] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.662970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.666905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.666954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.667413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.667459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.667835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.669736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.669784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.047 [2024-07-25 13:42:07.671610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.671656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.672137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.672149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.674549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.674600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.676540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.676590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.677154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.678286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.678333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.680005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.680053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.047 [2024-07-25 13:42:07.680395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.680407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.684156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.684205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.684669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.684715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.685085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.685640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.685687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.687217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.687263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.687646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.687659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.047 [2024-07-25 13:42:07.691181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.691230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.693097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.693143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.693735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.694653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.694700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.695827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.695873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.696370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.696382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.700188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.700237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.047 [2024-07-25 13:42:07.702069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.702115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.702627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.704577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.047 [2024-07-25 13:42:07.704628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.048 [2024-07-25 13:42:07.705086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.048 [2024-07-25 13:42:07.705132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.048 [2024-07-25 13:42:07.705473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.048 [2024-07-25 13:42:07.705485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.048 [2024-07-25 13:42:07.708981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.048 [2024-07-25 13:42:07.709030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.048 [2024-07-25 13:42:07.710681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.048 [2024-07-25 13:42:07.710727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.048 [2024-07-25 13:42:07.711062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.048 [... same *ERROR* message repeated ~270 times between 13:42:07.711 and 13:42:07.885 ...]
00:33:27.313 [2024-07-25 13:42:07.885837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.886297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.886872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.886922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.887382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.887427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.887891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.888382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.888394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.891043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.891507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.891558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.892018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.313 [2024-07-25 13:42:07.892519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.892572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.893033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.893079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.893540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.894007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.313 [2024-07-25 13:42:07.894019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.897553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.898019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.898066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.898526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.899054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.899104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.314 [2024-07-25 13:42:07.899568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.899614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.900074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.900585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.900598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.903290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.903756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.903803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.904262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.904824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.904874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.905333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.905378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.314 [2024-07-25 13:42:07.905841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.906331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.906343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.909993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.910459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.910506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.910969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.911514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.911566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.912026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.912072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.912534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.913027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.314 [2024-07-25 13:42:07.913039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.915804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.916267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.916318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.916781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.917346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.917394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.917857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.917911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.918375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.918907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.918920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.922499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.314 [2024-07-25 13:42:07.922969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.923016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.923475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.924027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.924077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.924537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.924588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.925049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.925562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.925575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.928381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.928852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.928901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.314 [2024-07-25 13:42:07.930282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.930735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.930785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.932709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.932756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.933220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.933733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.933749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.937267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.937746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.937794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.938253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.938804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.314 [2024-07-25 13:42:07.938854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.939314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.939361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.939826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.940341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.940353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.943162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.943638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.943686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.945390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.945748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.945798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.947702] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.314 [2024-07-25 13:42:07.947749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.949694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.950182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.950193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.954615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.956258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.314 [2024-07-25 13:42:07.956305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.956349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.956685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.956736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.958689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.958735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.960256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.315 [2024-07-25 13:42:07.960671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.960684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.964085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.965726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.965772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.967596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.967932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.967981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.969212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.970859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.970905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.971235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.971246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.315 [2024-07-25 13:42:07.977746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.979252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.981143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.982807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.983140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.984977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.985490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.985952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.986412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.986750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.986762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.990451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.991882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.315 [2024-07-25 13:42:07.992350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.992814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.993297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.995078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.997033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:07.998908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.000840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.001325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.001337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.007310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.009175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.011136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.013091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.315 [2024-07-25 13:42:08.013544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.015163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.016993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.018810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.020254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.020788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.020800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.024969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.026214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.027849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.029666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.029999] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.315 [2024-07-25 13:42:08.031119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.315 [2024-07-25 13:42:08.031585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:27.581 [2024-07-25 13:42:08.290366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.290835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.290882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.291340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.291385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.291949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.291963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.294989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.295042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.295503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.295560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.296071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.296551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.581 [2024-07-25 13:42:08.296600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.297060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.297121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.297669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.297682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.301701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.301754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.302214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.302686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.303218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.303688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.581 [2024-07-25 13:42:08.303747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.304207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.582 [2024-07-25 13:42:08.304253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.304776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.304789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.307588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.307637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.308096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.308143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.308610] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.309075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.309122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.309167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.309632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.310151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.582 [2024-07-25 13:42:08.310164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.313749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.313803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.313848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.313892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.314367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.314418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.314462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.314506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.314555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.315069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.315081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.317727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.582 [2024-07-25 13:42:08.317775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.317820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.317870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.318400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.318456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.318502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.318550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.318595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.319117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.319129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.322650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.322701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.322750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.582 [2024-07-25 13:42:08.322795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.323246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.323294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.323339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.323395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.323439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.324016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.324028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.326411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.326459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.326507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.326565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.327153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.582 [2024-07-25 13:42:08.327202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.327247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.327293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.327337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.327875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.327887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.331358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.331418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.331462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.331506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.332046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.332100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.332145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.582 [2024-07-25 13:42:08.332190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.332239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.332828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.332840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.335523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.335574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.335619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.335664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.336172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.336221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.336265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.336314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.336369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.582 [2024-07-25 13:42:08.336897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.336909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.340604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.340655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.340699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.340743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.341079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.341128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.341173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.341216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.341260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.341594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.582 [2024-07-25 13:42:08.341606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.583 [2024-07-25 13:42:08.343553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.344016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.344062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.344521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.345014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.345063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.345108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.345152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.345197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.345554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.345567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.351391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.351860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.583 [2024-07-25 13:42:08.351907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.352365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.352714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.352769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.354718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.354767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.356675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.357044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.357056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.359146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.359612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.359659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.360660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.583 [2024-07-25 13:42:08.361031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.361079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.362911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.362957] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.364762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.365237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.583 [2024-07-25 13:42:08.365248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.845 [2024-07-25 13:42:08.369641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.845 [2024-07-25 13:42:08.371484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.845 [2024-07-25 13:42:08.371540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.845 [2024-07-25 13:42:08.373488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.845 [2024-07-25 13:42:08.373824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.845 [2024-07-25 13:42:08.373874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.845 [2024-07-25 13:42:08.375373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.845 [2024-07-25 13:42:08.375420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.845 [2024-07-25 13:42:08.377370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.845 [2024-07-25 13:42:08.377733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.846 [2024-07-25 13:42:08.377746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.846 [2024-07-25 13:42:08.380166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.846 [2024-07-25 13:42:08.381818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.846 [2024-07-25 13:42:08.381872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.846 [2024-07-25 13:42:08.383600] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.846 [2024-07-25 13:42:08.383936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.846 [2024-07-25 13:42:08.383985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.846 [2024-07-25 13:42:08.385936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:27.846 [2024-07-25 13:42:08.385983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:27.846 [2024-07-25 13:42:08.387541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical "Failed to get src_mbufs!" error from accel_dpdk_cryptodev.c:468 repeated for every entry between 13:42:08.387925 and 13:42:08.671238; duplicate lines omitted ...]
00:33:28.112 [2024-07-25 13:42:08.671704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.672164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.672628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.673174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.673187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.677086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.677556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.678017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.678477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.678968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.679436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.679902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.680362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.112 [2024-07-25 13:42:08.680825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.681370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.681383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.685385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.685855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.686318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.686781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.687282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.687750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.688214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.688678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.689140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.689676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.112 [2024-07-25 13:42:08.689692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.693737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.693787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.694250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.694295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.694826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.695290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.695755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.696215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.696684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.697156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.697168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.701079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.112 [2024-07-25 13:42:08.701130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.701595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.701641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.702152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.702620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.702675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.703136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.703182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.703653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.703667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.707899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.707950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.709277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.112 [2024-07-25 13:42:08.709323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.709900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.710365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.710414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.710876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.710927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.711427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.711439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.715018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.715069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.715528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.715578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.716132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.112 [2024-07-25 13:42:08.716607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.716655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.717113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.717158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.112 [2024-07-25 13:42:08.717679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.717692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.721132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.721183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.721647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.721693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.722227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.723759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.723806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.113 [2024-07-25 13:42:08.725442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.725487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.725823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.725835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.730208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.730260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.730725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.730771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.731133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.733000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.733052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.734877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.734923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.113 [2024-07-25 13:42:08.735396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.735409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.740173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.740225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.741856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.741903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.742285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.744123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.744170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.746134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.746182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.746661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.746674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.113 [2024-07-25 13:42:08.750803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.750854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.752497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.752544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.752882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.754656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.754703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.755906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.755952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.756342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.756354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.760649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.760700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.113 [2024-07-25 13:42:08.762528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.762582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.762931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.764173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.764221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.765864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.765911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.766242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.766255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.770806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.770858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.772685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.772732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.113 [2024-07-25 13:42:08.773185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.774863] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.774911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.776737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.776785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.777115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.777128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.781786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.781838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.783573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.783620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.783963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.785768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.113 [2024-07-25 13:42:08.785816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.787769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.787818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.788177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.788189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.792725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.792781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.794018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.794066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.794436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.796268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.796315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.798136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.113 [2024-07-25 13:42:08.798183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.798680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.798693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.803748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.803799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.113 [2024-07-25 13:42:08.805642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.114 [2024-07-25 13:42:08.805689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.114 [2024-07-25 13:42:08.806043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.114 [2024-07-25 13:42:08.808027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.114 [2024-07-25 13:42:08.808076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.114 [2024-07-25 13:42:08.810030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.114 [2024-07-25 13:42:08.810078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.114 [2024-07-25 13:42:08.810588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.114 [2024-07-25 13:42:08.810602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [last message repeated through 2024-07-25 13:42:08.990684] 
00:33:28.380 [2024-07-25 13:42:08.991143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.991675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.991689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.993871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.994337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.994384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.994848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.995358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.995408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.995875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.995922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.996380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.996904] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.380 [2024-07-25 13:42:08.996917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.999032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.999497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:08.999543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.000008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.000573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.000634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.001097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.001143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.001607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.002125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.002138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.004394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.380 [2024-07-25 13:42:09.004864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.004925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.005385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.005939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.005991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.006451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.006496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.006960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.007480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.007493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.009636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.010101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.010151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.380 [2024-07-25 13:42:09.010615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.011125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.011174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.011639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.011690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.012150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.012720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.012733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.014836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.015301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.015347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.015811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.016367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.380 [2024-07-25 13:42:09.016418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.016882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.016928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.017395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.017914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.017927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.020146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.020615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.380 [2024-07-25 13:42:09.020662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.021134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.021628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.021679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.022138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.381 [2024-07-25 13:42:09.022185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.022649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.023174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.023186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.025099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.025570] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.025616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.026076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.026639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.026693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.027154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.027200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.027664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.381 [2024-07-25 13:42:09.028179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.028191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.030271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.030744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.030791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.031251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.031739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.031789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.032248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.032294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.032758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.033242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.033254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.381 [2024-07-25 13:42:09.035206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.035677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.035725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.036185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.036751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.036802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.037264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.037311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.037774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.038291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.038303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.040506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.040977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.381 [2024-07-25 13:42:09.041029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.041489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.041988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.042039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.042499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.042561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.043022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.043593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.043606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.045378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.045849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.045896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.046355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.381 [2024-07-25 13:42:09.046878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.046938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.047399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.047445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.047910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.048414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.048427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.050121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.051045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.051093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.051558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.052145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.052197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.381 [2024-07-25 13:42:09.052661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.052708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.053173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.053709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.053722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.055670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.056134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.056181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.056645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.057129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.057179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.057651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.057698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.381 [2024-07-25 13:42:09.058156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.058734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.058748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.060562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.061028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.061074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.061534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.381 [2024-07-25 13:42:09.062006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.062056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.062516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.062567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.063027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.063649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.382 [2024-07-25 13:42:09.063661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.065464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.067015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.067062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.068704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.069042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.069092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.071062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.071110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:28.382 [2024-07-25 13:42:09.072180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:28.953 
00:33:28.953 Latency(us) 
00:33:28.953 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 
00:33:28.953 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:33:28.953 Verification LBA range: start 0x0 length 0x100 
00:33:28.953 crypto_ram : 5.78 44.32 2.77 0.00 0.00 2804821.07 324251.96 2193943.63 
00:33:28.953 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:33:28.953 Verification LBA range: start 0x100 length 0x100 
00:33:28.953 crypto_ram : 5.90 33.75 2.11 0.00 0.00 3511572.40 58074.98 3019898.88 
00:33:28.953 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:33:28.953 Verification LBA range: start 0x0 length 0x100 
00:33:28.953 crypto_ram1 : 5.78 44.31 2.77 0.00 0.00 2716000.89 324251.96 2039077.02 
00:33:28.953 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:33:28.953 Verification LBA range: start 0x100 length 0x100 
00:33:28.953 crypto_ram1 : 5.93 37.45 2.34 0.00 0.00 3113081.72 65334.35 2787598.97 
00:33:28.953 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:33:28.953 Verification LBA range: start 0x0 length 0x100 
00:33:28.953 crypto_ram2 : 5.54 300.17 18.76 0.00 0.00 383988.94 24298.73 471052.60 
00:33:28.953 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:33:28.953 Verification LBA range: start 0x100 length 0x100 
00:33:28.953 crypto_ram2 : 5.67 216.89 13.56 0.00 0.00 521457.28 28634.19 616240.05 
00:33:28.953 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 
00:33:28.953 Verification LBA range: start 0x0 length 0x100 
00:33:28.953 crypto_ram3 : 5.65 316.79 19.80 0.00 0.00 355783.89 45572.73 438788.73 
00:33:28.953 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 
00:33:28.953 Verification LBA range: start 0x100 length 0x100 
00:33:28.953 crypto_ram3 : 5.79 232.31 14.52 0.00 0.00 474921.21 7360.20 483958.15 
00:33:28.953 =================================================================================================================== 
00:33:28.953 Total : 1226.00 76.62 0.00 0.00 769756.40 7360.20 3019898.88 
00:33:29.215 
00:33:29.215 real 0m8.815s 
00:33:29.215 user 0m16.953s 
00:33:29.215 sys 0m0.303s 
00:33:29.215 13:42:09 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1126 -- # xtrace_disable 
00:33:29.215 13:42:09 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 
00:33:29.215 ************************************ 
00:33:29.215 END TEST bdev_verify_big_io 
00:33:29.215 ************************************ 
00:33:29.215 13:42:09 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:33:29.215 13:42:09 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 
00:33:29.215 13:42:09 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 
00:33:29.215 13:42:09 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 
00:33:29.215 ************************************ 
00:33:29.215 START TEST bdev_write_zeroes 
00:33:29.215 ************************************ 
00:33:29.215 13:42:09 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 
00:33:29.475 [2024-07-25 13:42:10.022988] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:33:29.475 [2024-07-25 13:42:10.023033] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1116861 ] 00:33:29.475 [2024-07-25 13:42:10.111400] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:29.475 [2024-07-25 13:42:10.181650] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:29.475 [2024-07-25 13:42:10.202719] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:29.475 [2024-07-25 13:42:10.210745] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:29.475 [2024-07-25 13:42:10.218763] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:29.735 [2024-07-25 13:42:10.302195] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:32.278 [2024-07-25 13:42:12.453097] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:32.278 [2024-07-25 13:42:12.453145] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:32.278 [2024-07-25 13:42:12.453153] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:32.278 [2024-07-25 13:42:12.461115] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:32.278 [2024-07-25 13:42:12.461125] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:32.278 [2024-07-25 13:42:12.461131] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:32.278 [2024-07-25 13:42:12.469135] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key 
"test_dek_qat_cbc2" 00:33:32.278 [2024-07-25 13:42:12.469145] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:32.278 [2024-07-25 13:42:12.469150] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:32.278 [2024-07-25 13:42:12.477154] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:32.278 [2024-07-25 13:42:12.477164] bdev.c:8190:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:32.278 [2024-07-25 13:42:12.477170] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:32.278 Running I/O for 1 seconds... 00:33:32.849 00:33:32.849 Latency(us) 00:33:32.849 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:32.849 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:32.849 crypto_ram : 1.02 2350.71 9.18 0.00 0.00 54133.17 4814.38 65737.65 00:33:32.849 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:32.849 crypto_ram1 : 1.02 2356.29 9.20 0.00 0.00 53734.92 4814.38 60898.07 00:33:32.849 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:32.849 crypto_ram2 : 1.02 18189.81 71.05 0.00 0.00 6948.99 2129.92 9225.45 00:33:32.849 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:33:32.849 crypto_ram3 : 1.02 18168.62 70.97 0.00 0.00 6929.98 2129.92 7208.96 00:33:32.849 =================================================================================================================== 00:33:32.849 Total : 41065.43 160.41 0.00 0.00 12344.95 2129.92 65737.65 00:33:33.109 00:33:33.109 real 0m3.847s 00:33:33.109 user 0m3.560s 00:33:33.109 sys 0m0.251s 00:33:33.109 13:42:13 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:33.109 13:42:13 
blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:33.109 ************************************ 00:33:33.109 END TEST bdev_write_zeroes 00:33:33.109 ************************************ 00:33:33.109 13:42:13 blockdev_crypto_qat -- bdev/blockdev.sh@781 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:33.109 13:42:13 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:33:33.109 13:42:13 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:33.109 13:42:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:33.109 ************************************ 00:33:33.109 START TEST bdev_json_nonenclosed 00:33:33.109 ************************************ 00:33:33.109 13:42:13 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:33.369 [2024-07-25 13:42:13.949564] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:33:33.369 [2024-07-25 13:42:13.949618] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1117708 ] 00:33:33.369 [2024-07-25 13:42:14.039403] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:33.369 [2024-07-25 13:42:14.115393] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:33.369 [2024-07-25 13:42:14.115449] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:33:33.369 [2024-07-25 13:42:14.115460] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:33.369 [2024-07-25 13:42:14.115467] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:33.630 00:33:33.630 real 0m0.279s 00:33:33.630 user 0m0.169s 00:33:33.630 sys 0m0.108s 00:33:33.630 13:42:14 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:33.630 13:42:14 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:33.630 ************************************ 00:33:33.630 END TEST bdev_json_nonenclosed 00:33:33.630 ************************************ 00:33:33.630 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@784 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:33.630 13:42:14 blockdev_crypto_qat -- common/autotest_common.sh@1101 -- # '[' 13 -le 1 ']' 00:33:33.630 13:42:14 blockdev_crypto_qat -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:33.630 13:42:14 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:33.630 ************************************ 00:33:33.630 START TEST 
bdev_json_nonarray 00:33:33.630 ************************************ 00:33:33.631 13:42:14 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:33.631 [2024-07-25 13:42:14.307745] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:33:33.631 [2024-07-25 13:42:14.307799] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1117750 ] 00:33:33.631 [2024-07-25 13:42:14.399010] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:33.892 [2024-07-25 13:42:14.474999] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:33.892 [2024-07-25 13:42:14.475060] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:33:33.892 [2024-07-25 13:42:14.475070] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:33.892 [2024-07-25 13:42:14.475077] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:33.892 00:33:33.892 real 0m0.279s 00:33:33.892 user 0m0.166s 00:33:33.892 sys 0m0.112s 00:33:33.892 13:42:14 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1126 -- # xtrace_disable 00:33:33.892 13:42:14 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:33.892 ************************************ 00:33:33.892 END TEST bdev_json_nonarray 00:33:33.892 ************************************ 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@786 -- # [[ crypto_qat == bdev ]] 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@793 -- # [[ crypto_qat == gpt ]] 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@797 -- # [[ crypto_qat == crypto_sw ]] 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@809 -- # trap - SIGINT SIGTERM EXIT 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # cleanup 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:33:33.892 13:42:14 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:33:33.892 00:33:33.892 real 1m10.635s 00:33:33.892 user 2m48.444s 00:33:33.892 sys 0m6.774s 00:33:33.892 13:42:14 blockdev_crypto_qat -- common/autotest_common.sh@1126 -- # 
xtrace_disable 00:33:33.892 13:42:14 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:33.892 ************************************ 00:33:33.892 END TEST blockdev_crypto_qat 00:33:33.892 ************************************ 00:33:33.892 13:42:14 -- spdk/autotest.sh@364 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:33:33.892 13:42:14 -- common/autotest_common.sh@1101 -- # '[' 2 -le 1 ']' 00:33:33.892 13:42:14 -- common/autotest_common.sh@1107 -- # xtrace_disable 00:33:33.892 13:42:14 -- common/autotest_common.sh@10 -- # set +x 00:33:33.892 ************************************ 00:33:33.892 START TEST chaining 00:33:33.892 ************************************ 00:33:33.892 13:42:14 chaining -- common/autotest_common.sh@1125 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:33:34.153 * Looking for test storage... 00:33:34.153 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:34.153 13:42:14 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@7 -- # uname -s 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@17 -- # nvme 
gen-hostnqn 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=80f8a7aa-1216-ec11-9bc7-a4bf018b228a 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:33:34.153 13:42:14 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:33:34.153 13:42:14 chaining -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:33:34.153 13:42:14 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:33:34.153 13:42:14 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:34.153 13:42:14 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:34.153 13:42:14 chaining -- paths/export.sh@4 
-- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:34.153 13:42:14 chaining -- paths/export.sh@5 -- # export PATH 00:33:34.153 13:42:14 chaining -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@47 -- # : 0 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:33:34.153 13:42:14 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:33:34.153 13:42:14 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:33:34.153 13:42:14 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 
33445566778899001122334455001122) 00:33:34.153 13:42:14 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:33:34.153 13:42:14 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:33:34.153 13:42:14 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:33:34.153 13:42:14 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:33:34.153 13:42:14 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:33:34.153 13:42:14 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:33:34.153 13:42:14 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:33:42.290 13:42:23 
chaining -- nvmf/common.sh@296 -- # e810=() 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@297 -- # x722=() 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@298 -- # mlx=() 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:33:42.290 13:42:23 chaining -- 
nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.0 (0x8086 - 0x159b)' 00:33:42.290 Found 0000:4b:00.0 (0x8086 - 0x159b) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.1 (0x8086 - 0x159b)' 00:33:42.290 Found 0000:4b:00.1 (0x8086 - 0x159b) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@399 -- # 
pci_net_devs=("${pci_net_devs[@]##*/}") 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.0: cvl_0_0' 00:33:42.290 Found net devices under 0000:4b:00.0: cvl_0_0 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.1: cvl_0_1' 00:33:42.290 Found net devices under 0000:4b:00.1: cvl_0_1 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:33:42.290 13:42:23 chaining -- 
nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:33:42.290 13:42:23 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:33:42.551 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:33:42.551 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.603 ms 00:33:42.551 00:33:42.551 --- 10.0.0.2 ping statistics --- 00:33:42.551 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:42.551 rtt min/avg/max/mdev = 0.603/0.603/0.603/0.000 ms 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:33:42.551 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:33:42.551 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.223 ms 00:33:42.551 00:33:42.551 --- 10.0.0.1 ping statistics --- 00:33:42.551 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:42.551 rtt min/avg/max/mdev = 0.223/0.223/0.223/0.000 ms 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@422 -- # return 0 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:33:42.551 13:42:23 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:33:42.810 13:42:23 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:33:42.810 13:42:23 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:33:42.810 13:42:23 chaining -- common/autotest_common.sh@724 -- # xtrace_disable 00:33:42.810 13:42:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:42.810 13:42:23 chaining -- nvmf/common.sh@481 -- # nvmfpid=1121995 00:33:42.810 13:42:23 chaining -- nvmf/common.sh@482 -- # waitforlisten 1121995 00:33:42.810 13:42:23 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:33:42.810 13:42:23 chaining -- common/autotest_common.sh@831 -- # '[' -z 1121995 ']' 00:33:42.810 13:42:23 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:42.810 13:42:23 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:42.810 13:42:23 chaining -- 
common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:42.810 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:42.810 13:42:23 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:42.810 13:42:23 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:42.810 [2024-07-25 13:42:23.451916] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:33:42.810 [2024-07-25 13:42:23.451977] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:42.810 [2024-07-25 13:42:23.551314] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:43.071 [2024-07-25 13:42:23.659915] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:33:43.071 [2024-07-25 13:42:23.659980] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:33:43.071 [2024-07-25 13:42:23.659990] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:33:43.071 [2024-07-25 13:42:23.659999] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:33:43.071 [2024-07-25 13:42:23.660007] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
00:33:43.071 [2024-07-25 13:42:23.660037] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1
00:33:43.642 13:42:24 chaining -- common/autotest_common.sh@860 -- # (( i == 0 ))
00:33:43.642 13:42:24 chaining -- common/autotest_common.sh@864 -- # return 0
00:33:43.642 13:42:24 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt
00:33:43.642 13:42:24 chaining -- common/autotest_common.sh@730 -- # xtrace_disable
00:33:43.642 13:42:24 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:43.643 13:42:24 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT
00:33:43.643 13:42:24 chaining -- bdev/chaining.sh@69 -- # mktemp
00:33:43.643 13:42:24 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.6CtxA1hvQ5
00:33:43.643 13:42:24 chaining -- bdev/chaining.sh@69 -- # mktemp
00:33:43.643 13:42:24 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.V6b8cpgCsW
00:33:43.643 13:42:24 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT
00:33:43.643 13:42:24 chaining -- bdev/chaining.sh@72 -- # rpc_cmd
00:33:43.643 13:42:24 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:43.643 13:42:24 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:43.643 malloc0
00:33:43.643 true
00:33:43.643 true
00:33:43.643 [2024-07-25 13:42:24.407347] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0"
00:33:43.643 crypto0
00:33:43.643 [2024-07-25 13:42:24.415373] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1"
00:33:43.643 crypto1
00:33:43.643 [2024-07-25 13:42:24.423521] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init ***
00:33:43.905 [2024-07-25 13:42:24.439798] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 ***
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@85 -- # update_stats
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # opcode=
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]]
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed'
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]]
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed'
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # opcode=copy
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]]
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed'
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:43.905 13:42:24 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@88 -- # dd if=/dev/urandom of=/tmp/tmp.6CtxA1hvQ5 bs=1K count=64
00:33:43.905 64+0 records in
00:33:43.905 64+0 records out
00:33:43.905 65536 bytes (66 kB, 64 KiB) copied, 0.00103928 s, 63.1 MB/s
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.6CtxA1hvQ5 --ob Nvme0n1 --bs 65536 --count 1
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@25 -- # local config
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |=
00:33:43.905 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}'
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@31 -- # config='{
00:33:43.905 "subsystems": [
00:33:43.905 {
00:33:43.905 "subsystem": "bdev",
00:33:43.905 "config": [
00:33:43.905 {
00:33:43.905 "method": "bdev_nvme_attach_controller",
00:33:43.905 "params": {
00:33:43.905 "trtype": "tcp",
00:33:43.905 "adrfam": "IPv4",
00:33:43.905 "name": "Nvme0",
00:33:43.905 "subnqn": "nqn.2016-06.io.spdk:cnode0",
00:33:43.905 "traddr": "10.0.0.2",
00:33:43.905 "trsvcid": "4420"
00:33:43.905 }
00:33:43.905 },
00:33:43.905 {
00:33:43.905 "method": "bdev_set_options",
00:33:43.905 "params": {
00:33:43.905 "bdev_auto_examine": false
00:33:43.905 }
00:33:43.905 }
00:33:43.905 ]
00:33:43.905 }
00:33:43.905 ]
00:33:43.905 }'
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.6CtxA1hvQ5 --ob Nvme0n1 --bs 65536 --count 1
00:33:43.905 13:42:24 chaining -- bdev/chaining.sh@33 -- # echo '{
00:33:43.905 "subsystems": [
00:33:43.905 {
00:33:43.905 "subsystem": "bdev",
00:33:43.905 "config": [
00:33:43.905 {
00:33:43.905 "method": "bdev_nvme_attach_controller",
00:33:43.905 "params": {
00:33:43.905 "trtype": "tcp",
00:33:43.905 "adrfam": "IPv4",
00:33:43.905 "name": "Nvme0",
00:33:43.905 "subnqn": "nqn.2016-06.io.spdk:cnode0",
00:33:43.905 "traddr": "10.0.0.2",
00:33:43.905 "trsvcid": "4420"
00:33:43.905 }
00:33:43.905 },
00:33:43.905 {
00:33:43.905 "method": "bdev_set_options",
00:33:43.905 "params": {
00:33:43.905 "bdev_auto_examine": false
00:33:43.905 }
00:33:43.905 }
00:33:43.905 ]
00:33:43.905 }
00:33:43.905 ]
00:33:43.905 }'
00:33:44.166 [2024-07-25 13:42:24.720575] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:33:44.166 [2024-07-25 13:42:24.720636] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1122274 ]
00:33:44.166 [2024-07-25 13:42:24.809866] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:44.166 [2024-07-25 13:42:24.903581] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:44.688  Copying: 64/64 [kB] (average 31 MBps)
00:33:44.688
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@39 -- # opcode=
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:33:44.688 13:42:25 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:44.688 13:42:25 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:44.688 13:42:25 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 ))
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]]
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:44.688 13:42:25 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:44.688 13:42:25 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:44.688 13:42:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed'
00:33:44.688 13:42:25 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 ))
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed'
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] ))
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # opcode=copy
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed'
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] ))
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@96 -- # update_stats
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # opcode=
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed'
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed'
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # opcode=copy
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]]
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:44.949 13:42:25 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed'
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:44.949 13:42:25 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:45.211 13:42:25 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4
00:33:45.211 13:42:25 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.V6b8cpgCsW --ib Nvme0n1 --bs 65536 --count 1
00:33:45.211 13:42:25 chaining -- bdev/chaining.sh@25 -- # local config
00:33:45.211 13:42:25 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0
00:33:45.211 13:42:25 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |=
00:33:45.211 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}'
00:33:45.211 13:42:25 chaining -- bdev/chaining.sh@31 -- # config='{
00:33:45.211 "subsystems": [
00:33:45.211 {
00:33:45.211 "subsystem": "bdev",
00:33:45.211 "config": [
00:33:45.211 {
00:33:45.211 "method": "bdev_nvme_attach_controller",
00:33:45.211 "params": {
00:33:45.211 "trtype": "tcp",
00:33:45.211 "adrfam": "IPv4",
00:33:45.211 "name": "Nvme0",
00:33:45.211 "subnqn": "nqn.2016-06.io.spdk:cnode0",
00:33:45.211 "traddr": "10.0.0.2",
00:33:45.211 "trsvcid": "4420"
00:33:45.211 }
00:33:45.211 },
00:33:45.211 {
00:33:45.211 "method": "bdev_set_options",
00:33:45.211 "params": {
00:33:45.211 "bdev_auto_examine": false
00:33:45.211 }
00:33:45.211 }
00:33:45.211 ]
00:33:45.211 }
00:33:45.211 ]
00:33:45.211 }'
00:33:45.211 13:42:25 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.V6b8cpgCsW --ib Nvme0n1 --bs 65536 --count 1
00:33:45.211 13:42:25 chaining -- bdev/chaining.sh@33 -- # echo '{
00:33:45.211 "subsystems": [
00:33:45.211 {
00:33:45.211 "subsystem": "bdev",
00:33:45.211 "config": [
00:33:45.211 {
00:33:45.211 "method": "bdev_nvme_attach_controller",
00:33:45.211 "params": {
00:33:45.211 "trtype": "tcp",
00:33:45.211 "adrfam": "IPv4",
00:33:45.211 "name": "Nvme0",
00:33:45.211 "subnqn": "nqn.2016-06.io.spdk:cnode0",
00:33:45.211 "traddr": "10.0.0.2",
00:33:45.211 "trsvcid": "4420"
00:33:45.211 }
00:33:45.211 },
00:33:45.211 {
00:33:45.211 "method": "bdev_set_options",
00:33:45.211 "params": {
00:33:45.211 "bdev_auto_examine": false
00:33:45.211 }
00:33:45.211 }
00:33:45.211 ]
00:33:45.211 }
00:33:45.211 ]
00:33:45.211 }'
00:33:45.211 [2024-07-25 13:42:25.860215] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:33:45.211 [2024-07-25 13:42:25.860281] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1122390 ]
00:33:45.211 [2024-07-25 13:42:25.950979] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:45.484 [2024-07-25 13:42:26.045073] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:46.020  Copying: 64/64 [kB] (average 62 MBps)
00:33:46.020
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # opcode=
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 ))
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]]
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed'
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] ))
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]]
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed'
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 ))
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # opcode=copy
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]]
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed'
00:33:46.020 13:42:26 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] ))
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.6CtxA1hvQ5 /tmp/tmp.V6b8cpgCsW
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@25 -- # local config
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0
00:33:46.020 13:42:26 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |=
00:33:46.020 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}'
00:33:46.282 13:42:26 chaining -- bdev/chaining.sh@31 -- # config='{
00:33:46.282 "subsystems": [
00:33:46.282 {
00:33:46.282 "subsystem": "bdev",
00:33:46.282 "config": [
00:33:46.282 {
00:33:46.282 "method": "bdev_nvme_attach_controller",
00:33:46.282 "params": {
00:33:46.282 "trtype": "tcp",
00:33:46.282 "adrfam": "IPv4",
00:33:46.282 "name": "Nvme0",
00:33:46.282 "subnqn": "nqn.2016-06.io.spdk:cnode0",
00:33:46.282 "traddr": "10.0.0.2",
00:33:46.282 "trsvcid": "4420"
00:33:46.282 }
00:33:46.282 },
00:33:46.282 {
00:33:46.282 "method": "bdev_set_options",
00:33:46.282 "params": {
00:33:46.282 "bdev_auto_examine": false
00:33:46.282 }
00:33:46.282 }
00:33:46.282 ]
00:33:46.282 }
00:33:46.282 ]
00:33:46.282 }'
00:33:46.282 13:42:26 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1
00:33:46.282 13:42:26 chaining -- bdev/chaining.sh@33 -- # echo '{
00:33:46.282 "subsystems": [
00:33:46.282 {
00:33:46.282 "subsystem": "bdev",
00:33:46.282 "config": [
00:33:46.282 {
00:33:46.282 "method": "bdev_nvme_attach_controller",
00:33:46.282 "params": {
00:33:46.282 "trtype": "tcp",
00:33:46.282 "adrfam": "IPv4",
00:33:46.282 "name": "Nvme0",
00:33:46.282 "subnqn": "nqn.2016-06.io.spdk:cnode0",
00:33:46.282 "traddr": "10.0.0.2",
00:33:46.282 "trsvcid": "4420"
00:33:46.282 }
00:33:46.282 },
00:33:46.282 {
00:33:46.282 "method": "bdev_set_options",
00:33:46.282 "params": {
00:33:46.282 "bdev_auto_examine": false
00:33:46.282 }
00:33:46.282 }
00:33:46.282 ]
00:33:46.282 }
00:33:46.282 ]
00:33:46.282 }'
00:33:46.282 [2024-07-25 13:42:26.891791] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:33:46.282 [2024-07-25 13:42:26.891862] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1122709 ]
00:33:46.282 [2024-07-25 13:42:26.985913] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:46.543 [2024-07-25 13:42:27.077350] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:46.804  Copying: 64/64 [kB] (average 15 MBps)
00:33:46.804
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@106 -- # update_stats
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@39 -- # opcode=
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:33:46.804 13:42:27 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:46.804 13:42:27 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:46.804 13:42:27 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]]
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed'
00:33:46.804 13:42:27 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:46.804 13:42:27 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:46.804 13:42:27 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]]
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed'
00:33:46.804 13:42:27 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:46.804 13:42:27 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:46.804 13:42:27 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:46.804 13:42:27 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@39 -- # opcode=copy
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]]
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:47.066 13:42:27 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:47.066 13:42:27 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed'
00:33:47.066 13:42:27 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.6CtxA1hvQ5 --ob Nvme0n1 --bs 4096 --count 16
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@25 -- # local config
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |=
00:33:47.066 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}'
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@31 -- # config='{
00:33:47.066 "subsystems": [
00:33:47.066 {
00:33:47.066 "subsystem": "bdev",
00:33:47.066 "config": [
00:33:47.066 {
00:33:47.066 "method": "bdev_nvme_attach_controller",
00:33:47.066 "params": {
00:33:47.066 "trtype": "tcp",
00:33:47.066 "adrfam": "IPv4",
00:33:47.066 "name": "Nvme0",
00:33:47.066 "subnqn": "nqn.2016-06.io.spdk:cnode0",
00:33:47.066 "traddr": "10.0.0.2",
00:33:47.066 "trsvcid": "4420"
00:33:47.066 }
00:33:47.066 },
00:33:47.066 {
00:33:47.066 "method": "bdev_set_options",
00:33:47.066 "params": {
00:33:47.066 "bdev_auto_examine": false
00:33:47.066 }
00:33:47.066 }
00:33:47.066 ]
00:33:47.066 }
00:33:47.066 ]
00:33:47.066 }'
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.6CtxA1hvQ5 --ob Nvme0n1 --bs 4096 --count 16
00:33:47.066 13:42:27 chaining -- bdev/chaining.sh@33 -- # echo '{
00:33:47.066 "subsystems": [
00:33:47.066 {
00:33:47.066 "subsystem": "bdev",
00:33:47.066 "config": [
00:33:47.066 {
00:33:47.066 "method": "bdev_nvme_attach_controller",
00:33:47.066 "params": {
00:33:47.066 "trtype": "tcp",
00:33:47.066 "adrfam": "IPv4",
00:33:47.066 "name": "Nvme0",
00:33:47.066 "subnqn": "nqn.2016-06.io.spdk:cnode0",
00:33:47.066 "traddr": "10.0.0.2",
00:33:47.066 "trsvcid": "4420"
00:33:47.066 }
00:33:47.066 },
00:33:47.066 {
00:33:47.066 "method": "bdev_set_options",
00:33:47.066 "params": {
00:33:47.066 "bdev_auto_examine": false
00:33:47.066 }
00:33:47.066 }
00:33:47.066 ]
00:33:47.066 }
00:33:47.066 ]
00:33:47.066 }'
00:33:47.066 [2024-07-25 13:42:27.756962] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization...
00:33:47.066 [2024-07-25 13:42:27.757027] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1122747 ]
00:33:47.066 [2024-07-25 13:42:27.850326] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:33:47.327 [2024-07-25 13:42:27.942204] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0
00:33:47.848  Copying: 64/64 [kB] (average 9142 kBps)
00:33:47.848
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # opcode=
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 ))
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]]
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed'
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 ))
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]]
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed'
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] ))
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # opcode=copy
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]]
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed'
00:33:47.849 13:42:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:47.849 13:42:28 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:48.110 13:42:28 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] ))
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@114 -- # update_stats
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@39 -- # opcode=
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]]
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed
00:33:48.110 13:42:28 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:48.110 13:42:28 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:48.110 13:42:28 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]]
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed'
00:33:48.110 13:42:28 chaining -- common/autotest_common.sh@561 -- # xtrace_disable
00:33:48.110 13:42:28 chaining -- common/autotest_common.sh@10 -- # set +x
00:33:48.110 13:42:28 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]]
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@39 -- # event=executed
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt
00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd
00:33:48.110 13:42:28 chaining --
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:48.110 13:42:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:48.110 13:42:28 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:48.110 13:42:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:48.110 13:42:28 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:48.111 13:42:28 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:48.111 13:42:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:48.111 13:42:28 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@117 -- # : 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.V6b8cpgCsW --ib Nvme0n1 --bs 4096 --count 16 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@25 -- # local config 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems 
--trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:33:48.111 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:33:48.111 13:42:28 chaining -- bdev/chaining.sh@31 -- # config='{ 00:33:48.111 "subsystems": [ 00:33:48.111 { 00:33:48.111 "subsystem": "bdev", 00:33:48.111 "config": [ 00:33:48.111 { 00:33:48.111 "method": "bdev_nvme_attach_controller", 00:33:48.111 "params": { 00:33:48.111 "trtype": "tcp", 00:33:48.111 "adrfam": "IPv4", 00:33:48.111 "name": "Nvme0", 00:33:48.111 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:48.111 "traddr": "10.0.0.2", 00:33:48.111 "trsvcid": "4420" 00:33:48.111 } 00:33:48.111 }, 00:33:48.111 { 00:33:48.111 "method": "bdev_set_options", 00:33:48.111 "params": { 00:33:48.111 "bdev_auto_examine": false 00:33:48.111 } 00:33:48.111 } 00:33:48.111 ] 00:33:48.111 } 00:33:48.111 ] 00:33:48.111 }' 00:33:48.372 13:42:28 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.V6b8cpgCsW --ib Nvme0n1 --bs 4096 --count 16 00:33:48.372 13:42:28 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:33:48.372 "subsystems": [ 00:33:48.372 { 00:33:48.372 "subsystem": "bdev", 00:33:48.372 "config": [ 00:33:48.372 { 00:33:48.372 "method": "bdev_nvme_attach_controller", 00:33:48.372 "params": { 00:33:48.372 "trtype": "tcp", 00:33:48.372 "adrfam": "IPv4", 00:33:48.372 "name": "Nvme0", 00:33:48.372 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:48.372 "traddr": "10.0.0.2", 00:33:48.372 "trsvcid": "4420" 00:33:48.372 } 00:33:48.372 }, 00:33:48.372 { 00:33:48.372 "method": "bdev_set_options", 00:33:48.372 "params": { 00:33:48.372 "bdev_auto_examine": false 00:33:48.372 } 00:33:48.372 } 00:33:48.372 ] 00:33:48.372 } 00:33:48.372 ] 00:33:48.372 }' 00:33:48.372 [2024-07-25 13:42:28.956452] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 
initialization... 00:33:48.372 [2024-07-25 13:42:28.956516] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1123078 ] 00:33:48.372 [2024-07-25 13:42:29.046646] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:48.372 [2024-07-25 13:42:29.141008] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:48.944  Copying: 64/64 [kB] (average 719 kBps) 00:33:48.944 00:33:48.944 13:42:29 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:33:48.944 13:42:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:48.944 13:42:29 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:48.944 13:42:29 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:48.944 13:42:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:48.944 13:42:29 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:48.944 13:42:29 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:33:48.944 13:42:29 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:48.944 13:42:29 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:48.944 13:42:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:48.944 13:42:29 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:49.205 13:42:29 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:49.205 13:42:29 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:49.205 13:42:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:49.205 13:42:29 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:49.205 13:42:29 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:49.205 13:42:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:49.205 13:42:29 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:49.205 13:42:29 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:49.206 13:42:29 
chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:49.206 13:42:29 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:49.206 13:42:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:49.206 13:42:29 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.6CtxA1hvQ5 /tmp/tmp.V6b8cpgCsW 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.6CtxA1hvQ5 /tmp/tmp.V6b8cpgCsW 00:33:49.206 13:42:29 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@117 -- # sync 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@120 -- # set +e 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:33:49.206 rmmod nvme_tcp 00:33:49.206 rmmod nvme_fabrics 00:33:49.206 rmmod nvme_keyring 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@124 -- # set -e 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@125 -- # return 0 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@489 -- # '[' -n 1121995 ']' 00:33:49.206 13:42:29 chaining -- nvmf/common.sh@490 -- # killprocess 1121995 00:33:49.206 13:42:29 chaining -- common/autotest_common.sh@950 -- # '[' -z 
1121995 ']' 00:33:49.206 13:42:29 chaining -- common/autotest_common.sh@954 -- # kill -0 1121995 00:33:49.206 13:42:29 chaining -- common/autotest_common.sh@955 -- # uname 00:33:49.206 13:42:29 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:33:49.206 13:42:29 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1121995 00:33:49.466 13:42:30 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:33:49.466 13:42:30 chaining -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:33:49.466 13:42:30 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1121995' 00:33:49.466 killing process with pid 1121995 00:33:49.466 13:42:30 chaining -- common/autotest_common.sh@969 -- # kill 1121995 00:33:49.466 13:42:30 chaining -- common/autotest_common.sh@974 -- # wait 1121995 00:33:49.466 13:42:30 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:33:49.466 13:42:30 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:33:49.466 13:42:30 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:33:49.466 13:42:30 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:33:49.466 13:42:30 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:33:49.466 13:42:30 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:33:49.467 13:42:30 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:33:49.467 13:42:30 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:33:52.010 13:42:32 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:33:52.010 13:42:32 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:33:52.010 13:42:32 chaining -- bdev/chaining.sh@132 -- # bperfpid=1123632 00:33:52.010 13:42:32 chaining -- bdev/chaining.sh@134 -- # waitforlisten 1123632 00:33:52.010 13:42:32 chaining -- bdev/chaining.sh@131 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:33:52.010 13:42:32 chaining -- common/autotest_common.sh@831 -- # '[' -z 1123632 ']' 00:33:52.010 13:42:32 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:52.010 13:42:32 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:52.011 13:42:32 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:52.011 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:52.011 13:42:32 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:52.011 13:42:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:52.011 [2024-07-25 13:42:32.375508] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:33:52.011 [2024-07-25 13:42:32.375577] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1123632 ] 00:33:52.011 [2024-07-25 13:42:32.468190] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:52.011 [2024-07-25 13:42:32.562880] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:52.583 13:42:33 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:52.583 13:42:33 chaining -- common/autotest_common.sh@864 -- # return 0 00:33:52.583 13:42:33 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:33:52.583 13:42:33 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:52.583 13:42:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:52.583 malloc0 00:33:52.583 true 00:33:52.843 true 00:33:52.843 [2024-07-25 13:42:33.377828] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: 
Found key "key0" 00:33:52.843 crypto0 00:33:52.843 [2024-07-25 13:42:33.385850] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:33:52.843 crypto1 00:33:52.843 13:42:33 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:52.843 13:42:33 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:52.843 Running I/O for 5 seconds... 00:33:58.133 00:33:58.133 Latency(us) 00:33:58.133 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:58.133 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:33:58.133 Verification LBA range: start 0x0 length 0x2000 00:33:58.133 crypto1 : 5.01 14261.25 55.71 0.00 0.00 17906.52 5293.29 11443.59 00:33:58.133 =================================================================================================================== 00:33:58.133 Total : 14261.25 55.71 0.00 0.00 17906.52 5293.29 11443.59 00:33:58.133 0 00:33:58.133 13:42:38 chaining -- bdev/chaining.sh@146 -- # killprocess 1123632 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@950 -- # '[' -z 1123632 ']' 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@954 -- # kill -0 1123632 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@955 -- # uname 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1123632 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1123632' 00:33:58.133 killing process with pid 1123632 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@969 -- # kill 1123632 00:33:58.133 Received shutdown 
signal, test time was about 5.000000 seconds 00:33:58.133 00:33:58.133 Latency(us) 00:33:58.133 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:58.133 =================================================================================================================== 00:33:58.133 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@974 -- # wait 1123632 00:33:58.133 13:42:38 chaining -- bdev/chaining.sh@152 -- # bperfpid=1124663 00:33:58.133 13:42:38 chaining -- bdev/chaining.sh@154 -- # waitforlisten 1124663 00:33:58.133 13:42:38 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@831 -- # '[' -z 1124663 ']' 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:58.133 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:33:58.133 13:42:38 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:58.133 [2024-07-25 13:42:38.760982] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:33:58.133 [2024-07-25 13:42:38.761034] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1124663 ] 00:33:58.133 [2024-07-25 13:42:38.847337] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:58.133 [2024-07-25 13:42:38.909540] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:33:59.074 13:42:39 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:33:59.074 13:42:39 chaining -- common/autotest_common.sh@864 -- # return 0 00:33:59.074 13:42:39 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:33:59.074 13:42:39 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:33:59.074 13:42:39 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:59.074 malloc0 00:33:59.074 true 00:33:59.074 true 00:33:59.074 [2024-07-25 13:42:39.708358] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:33:59.074 [2024-07-25 13:42:39.708394] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:59.074 [2024-07-25 13:42:39.708406] vbdev_passthru.c: 681:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1004ae0 00:33:59.074 [2024-07-25 13:42:39.708413] vbdev_passthru.c: 696:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:59.074 [2024-07-25 13:42:39.709278] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:59.074 [2024-07-25 13:42:39.709295] vbdev_passthru.c: 710:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:33:59.074 pt0 00:33:59.074 [2024-07-25 13:42:39.716384] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:33:59.074 crypto0 00:33:59.074 [2024-07-25 13:42:39.724404] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:33:59.074 crypto1 00:33:59.074 13:42:39 chaining -- 
common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:33:59.074 13:42:39 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:59.074 Running I/O for 5 seconds... 00:34:04.371 00:34:04.371 Latency(us) 00:34:04.371 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:04.371 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:04.371 Verification LBA range: start 0x0 length 0x2000 00:34:04.371 crypto1 : 5.01 11089.11 43.32 0.00 0.00 23030.81 5217.67 14014.62 00:34:04.371 =================================================================================================================== 00:34:04.371 Total : 11089.11 43.32 0.00 0.00 23030.81 5217.67 14014.62 00:34:04.371 0 00:34:04.371 13:42:44 chaining -- bdev/chaining.sh@167 -- # killprocess 1124663 00:34:04.371 13:42:44 chaining -- common/autotest_common.sh@950 -- # '[' -z 1124663 ']' 00:34:04.372 13:42:44 chaining -- common/autotest_common.sh@954 -- # kill -0 1124663 00:34:04.372 13:42:44 chaining -- common/autotest_common.sh@955 -- # uname 00:34:04.372 13:42:44 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:04.372 13:42:44 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1124663 00:34:04.372 13:42:44 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:34:04.372 13:42:44 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:34:04.372 13:42:44 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1124663' 00:34:04.372 killing process with pid 1124663 00:34:04.372 13:42:44 chaining -- common/autotest_common.sh@969 -- # kill 1124663 00:34:04.372 Received shutdown signal, test time was about 5.000000 seconds 00:34:04.372 00:34:04.372 Latency(us) 00:34:04.372 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:04.372 
=================================================================================================================== 00:34:04.372 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:04.372 13:42:44 chaining -- common/autotest_common.sh@974 -- # wait 1124663 00:34:04.372 13:42:45 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:34:04.372 13:42:45 chaining -- bdev/chaining.sh@170 -- # killprocess 1124663 00:34:04.372 13:42:45 chaining -- common/autotest_common.sh@950 -- # '[' -z 1124663 ']' 00:34:04.372 13:42:45 chaining -- common/autotest_common.sh@954 -- # kill -0 1124663 00:34:04.372 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 954: kill: (1124663) - No such process 00:34:04.372 13:42:45 chaining -- common/autotest_common.sh@977 -- # echo 'Process with pid 1124663 is not found' 00:34:04.372 Process with pid 1124663 is not found 00:34:04.372 13:42:45 chaining -- bdev/chaining.sh@171 -- # wait 1124663 00:34:04.372 13:42:45 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:04.372 13:42:45 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:04.372 13:42:45 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:34:04.372 13:42:45 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@296 -- # e810=() 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@297 -- # x722=() 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@298 -- # mlx=() 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:34:04.372 13:42:45 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.0 (0x8086 - 0x159b)' 00:34:04.372 Found 0000:4b:00.0 (0x8086 - 0x159b) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:4b:00.1 (0x8086 - 0x159b)' 00:34:04.372 Found 0000:4b:00.1 (0x8086 - 0x159b) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:34:04.372 13:42:45 chaining -- 
nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.0: cvl_0_0' 00:34:04.372 Found net devices under 0000:4b:00.0: cvl_0_0 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:4b:00.1: cvl_0_1' 00:34:04.372 Found net devices under 0000:4b:00.1: cvl_0_1 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 
00:34:04.372 13:42:45 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:34:04.372 13:42:45 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:34:04.633 13:42:45 
chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:34:04.633 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:34:04.633 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.533 ms 00:34:04.633 00:34:04.633 --- 10.0.0.2 ping statistics --- 00:34:04.633 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:04.633 rtt min/avg/max/mdev = 0.533/0.533/0.533/0.000 ms 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:34:04.633 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:34:04.633 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.286 ms 00:34:04.633 00:34:04.633 --- 10.0.0.1 ping statistics --- 00:34:04.633 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:34:04.633 rtt min/avg/max/mdev = 0.286/0.286/0.286/0.000 ms 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@422 -- # return 0 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:34:04.633 13:42:45 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:34:04.633 13:42:45 chaining -- common/autotest_common.sh@724 -- # xtrace_disable 00:34:04.633 13:42:45 chaining -- common/autotest_common.sh@10 -- # set +x 
00:34:04.633 13:42:45 chaining -- nvmf/common.sh@481 -- # nvmfpid=1125746 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@482 -- # waitforlisten 1125746 00:34:04.633 13:42:45 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:34:04.633 13:42:45 chaining -- common/autotest_common.sh@831 -- # '[' -z 1125746 ']' 00:34:04.633 13:42:45 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/spdk.sock 00:34:04.633 13:42:45 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:04.633 13:42:45 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:34:04.633 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:34:04.633 13:42:45 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:04.633 13:42:45 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:04.894 [2024-07-25 13:42:45.467824] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:34:04.894 [2024-07-25 13:42:45.467887] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:34:04.894 [2024-07-25 13:42:45.567671] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:04.894 [2024-07-25 13:42:45.675737] app.c: 603:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:34:04.894 [2024-07-25 13:42:45.675801] app.c: 604:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 
00:34:04.894 [2024-07-25 13:42:45.675813] app.c: 609:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:34:04.894 [2024-07-25 13:42:45.675822] app.c: 610:app_setup_trace: *NOTICE*: SPDK application currently running. 00:34:04.894 [2024-07-25 13:42:45.675831] app.c: 611:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:34:04.894 [2024-07-25 13:42:45.675860] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 1 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@864 -- # return 0 00:34:05.835 13:42:46 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@730 -- # xtrace_disable 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:05.835 13:42:46 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:34:05.835 13:42:46 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@561 -- # xtrace_disable 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:05.835 malloc0 00:34:05.835 [2024-07-25 13:42:46.388237] tcp.c: 677:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:34:05.835 [2024-07-25 13:42:46.404472] tcp.c:1006:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@589 -- # [[ 0 == 0 ]] 00:34:05.835 13:42:46 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:34:05.835 13:42:46 chaining -- bdev/chaining.sh@189 -- # bperfpid=1125940 00:34:05.835 13:42:46 chaining -- bdev/chaining.sh@191 -- # waitforlisten 1125940 /var/tmp/bperf.sock 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@831 -- 
# '[' -z 1125940 ']' 00:34:05.835 13:42:46 chaining -- bdev/chaining.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bperf.sock 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:34:05.835 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:05.835 13:42:46 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:05.835 [2024-07-25 13:42:46.475203] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 00:34:05.835 [2024-07-25 13:42:46.475261] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1125940 ] 00:34:05.835 [2024-07-25 13:42:46.569125] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:06.095 [2024-07-25 13:42:46.647974] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:06.666 13:42:47 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:06.666 13:42:47 chaining -- common/autotest_common.sh@864 -- # return 0 00:34:06.666 13:42:47 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:34:06.666 13:42:47 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:34:06.926 [2024-07-25 13:42:47.659161] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:06.926 nvme0n1 00:34:06.926 
true 00:34:06.926 crypto0 00:34:06.926 13:42:47 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:34:07.186 Running I/O for 5 seconds... 00:34:12.469 00:34:12.469 Latency(us) 00:34:12.469 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:12.469 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:34:12.469 Verification LBA range: start 0x0 length 0x2000 00:34:12.469 crypto0 : 5.02 7175.51 28.03 0.00 0.00 35566.69 3125.56 26819.35 00:34:12.469 =================================================================================================================== 00:34:12.469 Total : 7175.51 28.03 0.00 0.00 35566.69 3125.56 26819.35 00:34:12.469 0 00:34:12.469 13:42:52 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:34:12.469 13:42:52 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:34:12.469 13:42:52 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:12.469 13:42:52 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:12.469 13:42:52 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:12.469 13:42:52 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:12.469 13:42:52 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:12.469 13:42:52 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:34:12.469 13:42:52 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:12.469 13:42:52 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@205 -- # sequence=72106 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:34:12.469 13:42:53 
chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@206 -- # encrypt=36053 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:12.469 13:42:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@207 -- # decrypt=36053 00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 
00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:12.733 13:42:53 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:34:12.996 13:42:53 chaining -- bdev/chaining.sh@208 -- # crc32c=72106 00:34:12.996 13:42:53 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:34:12.996 13:42:53 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:34:12.996 13:42:53 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:34:12.996 13:42:53 chaining -- bdev/chaining.sh@214 -- # killprocess 1125940 00:34:12.996 13:42:53 chaining -- common/autotest_common.sh@950 -- # '[' -z 1125940 ']' 00:34:12.996 13:42:53 chaining -- common/autotest_common.sh@954 -- # kill -0 1125940 00:34:12.996 13:42:53 chaining -- common/autotest_common.sh@955 -- # uname 00:34:12.996 13:42:53 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:12.996 13:42:53 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1125940 00:34:12.996 13:42:53 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:34:12.996 13:42:53 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:34:12.996 13:42:53 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1125940' 00:34:12.996 killing process with pid 1125940 00:34:12.996 13:42:53 chaining -- 
common/autotest_common.sh@969 -- # kill 1125940 00:34:12.996 Received shutdown signal, test time was about 5.000000 seconds 00:34:12.996 00:34:12.996 Latency(us) 00:34:12.996 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:12.996 =================================================================================================================== 00:34:12.996 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:12.996 13:42:53 chaining -- common/autotest_common.sh@974 -- # wait 1125940 00:34:13.257 13:42:53 chaining -- bdev/chaining.sh@219 -- # bperfpid=1127179 00:34:13.257 13:42:53 chaining -- bdev/chaining.sh@221 -- # waitforlisten 1127179 /var/tmp/bperf.sock 00:34:13.257 13:42:53 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:34:13.257 13:42:53 chaining -- common/autotest_common.sh@831 -- # '[' -z 1127179 ']' 00:34:13.257 13:42:53 chaining -- common/autotest_common.sh@835 -- # local rpc_addr=/var/tmp/bperf.sock 00:34:13.257 13:42:53 chaining -- common/autotest_common.sh@836 -- # local max_retries=100 00:34:13.257 13:42:53 chaining -- common/autotest_common.sh@838 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:34:13.257 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:34:13.257 13:42:53 chaining -- common/autotest_common.sh@840 -- # xtrace_disable 00:34:13.257 13:42:53 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:13.257 [2024-07-25 13:42:53.875262] Starting SPDK v24.09-pre git sha1 704257090 / DPDK 24.03.0 initialization... 
00:34:13.257 [2024-07-25 13:42:53.875315] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=lib.power:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid1127179 ] 00:34:13.257 [2024-07-25 13:42:53.960960] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:34:13.257 [2024-07-25 13:42:54.031166] reactor.c: 941:reactor_run: *NOTICE*: Reactor started on core 0 00:34:14.197 13:42:54 chaining -- common/autotest_common.sh@860 -- # (( i == 0 )) 00:34:14.197 13:42:54 chaining -- common/autotest_common.sh@864 -- # return 0 00:34:14.197 13:42:54 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:34:14.197 13:42:54 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:34:14.457 [2024-07-25 13:42:55.045100] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:34:14.457 nvme0n1 00:34:14.457 true 00:34:14.457 crypto0 00:34:14.457 13:42:55 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:34:14.457 Running I/O for 5 seconds... 
00:34:19.738 00:34:19.738 Latency(us) 00:34:19.738 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:19.738 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:34:19.738 Verification LBA range: start 0x0 length 0x200 00:34:19.738 crypto0 : 5.01 2181.48 136.34 0.00 0.00 14370.24 1203.59 18047.61 00:34:19.738 =================================================================================================================== 00:34:19.738 Total : 2181.48 136.34 0.00 0.00 14370.24 1203.59 18047.61 00:34:19.738 0 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@233 -- # sequence=21848 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:19.739 13:43:00 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:19.739 13:43:00 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:19.999 13:43:00 chaining -- bdev/chaining.sh@234 -- # encrypt=10924 00:34:19.999 13:43:00 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:34:19.999 13:43:00 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:34:19.999 13:43:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:19.999 13:43:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:19.999 13:43:00 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:19.999 13:43:00 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:20.000 13:43:00 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:20.000 13:43:00 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:20.000 13:43:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:20.000 13:43:00 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:20.260 13:43:00 chaining -- bdev/chaining.sh@235 -- # decrypt=10924 00:34:20.260 13:43:00 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:34:20.260 13:43:00 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:34:20.260 13:43:00 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:20.260 13:43:00 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:20.260 13:43:00 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:34:20.260 13:43:00 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:20.260 13:43:00 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:34:20.260 13:43:00 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:20.260 13:43:00 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:20.260 13:43:00 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:34:20.260 13:43:01 chaining -- bdev/chaining.sh@236 -- # crc32c=21848 00:34:20.260 13:43:01 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:34:20.260 13:43:01 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:34:20.260 13:43:01 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:34:20.260 13:43:01 chaining -- bdev/chaining.sh@242 -- # killprocess 1127179 00:34:20.260 13:43:01 chaining -- common/autotest_common.sh@950 -- # '[' -z 1127179 ']' 00:34:20.260 13:43:01 chaining -- common/autotest_common.sh@954 -- # kill -0 1127179 00:34:20.260 13:43:01 chaining -- common/autotest_common.sh@955 -- # uname 00:34:20.260 13:43:01 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:20.260 13:43:01 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1127179 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_0 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@960 -- # '[' reactor_0 = sudo ']' 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1127179' 00:34:20.520 killing process with pid 1127179 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@969 -- # kill 1127179 00:34:20.520 Received shutdown signal, test time was about 5.000000 seconds 00:34:20.520 00:34:20.520 Latency(us) 00:34:20.520 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:20.520 
=================================================================================================================== 00:34:20.520 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@974 -- # wait 1127179 00:34:20.520 13:43:01 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@117 -- # sync 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@120 -- # set +e 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:34:20.520 rmmod nvme_tcp 00:34:20.520 rmmod nvme_fabrics 00:34:20.520 rmmod nvme_keyring 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@124 -- # set -e 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@125 -- # return 0 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@489 -- # '[' -n 1125746 ']' 00:34:20.520 13:43:01 chaining -- nvmf/common.sh@490 -- # killprocess 1125746 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@950 -- # '[' -z 1125746 ']' 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@954 -- # kill -0 1125746 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@955 -- # uname 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@955 -- # '[' Linux = Linux ']' 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@956 -- # ps --no-headers -o comm= 1125746 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@956 -- # process_name=reactor_1 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@960 -- # '[' reactor_1 = sudo ']' 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@968 -- # echo 'killing process with pid 1125746' 00:34:20.520 killing process with pid 
1125746 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@969 -- # kill 1125746 00:34:20.520 13:43:01 chaining -- common/autotest_common.sh@974 -- # wait 1125746 00:34:20.819 13:43:01 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:34:20.819 13:43:01 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:34:20.819 13:43:01 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:34:20.819 13:43:01 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:34:20.819 13:43:01 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:34:20.819 13:43:01 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:20.819 13:43:01 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:20.819 13:43:01 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:22.787 13:43:03 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:34:22.787 13:43:03 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:34:22.787 00:34:22.787 real 0m48.911s 00:34:22.787 user 0m58.746s 00:34:22.787 sys 0m11.605s 00:34:22.787 13:43:03 chaining -- common/autotest_common.sh@1126 -- # xtrace_disable 00:34:22.787 13:43:03 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:22.787 ************************************ 00:34:22.787 END TEST chaining 00:34:22.787 ************************************ 00:34:23.047 13:43:03 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:34:23.047 13:43:03 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:34:23.047 13:43:03 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:34:23.047 13:43:03 -- spdk/autotest.sh@379 -- # [[ 0 -eq 1 ]] 00:34:23.047 13:43:03 -- spdk/autotest.sh@384 -- # trap - SIGINT SIGTERM EXIT 00:34:23.047 13:43:03 -- spdk/autotest.sh@386 -- # timing_enter post_cleanup 00:34:23.047 13:43:03 -- common/autotest_common.sh@724 -- # xtrace_disable 00:34:23.047 13:43:03 -- common/autotest_common.sh@10 -- # set +x 00:34:23.047 13:43:03 
-- spdk/autotest.sh@387 -- # autotest_cleanup
00:34:23.047 13:43:03 -- common/autotest_common.sh@1392 -- # local autotest_es=0
00:34:23.047 13:43:03 -- common/autotest_common.sh@1393 -- # xtrace_disable
00:34:23.047 13:43:03 -- common/autotest_common.sh@10 -- # set +x
00:34:29.628 INFO: APP EXITING
00:34:29.628 INFO: killing all VMs
00:34:29.628 INFO: killing vhost app
00:34:29.628 INFO: EXIT DONE
00:34:33.828 Waiting for block devices as requested
00:34:33.828 0000:80:01.6 (8086 0b00): vfio-pci -> ioatdma
00:34:33.828 0000:80:01.7 (8086 0b00): vfio-pci -> ioatdma
00:34:33.828 0000:80:01.4 (8086 0b00): vfio-pci -> ioatdma
00:34:33.828 0000:80:01.5 (8086 0b00): vfio-pci -> ioatdma
00:34:33.828 0000:80:01.2 (8086 0b00): vfio-pci -> ioatdma
00:34:34.089 0000:80:01.3 (8086 0b00): vfio-pci -> ioatdma
00:34:34.089 0000:80:01.0 (8086 0b00): vfio-pci -> ioatdma
00:34:34.089 0000:80:01.1 (8086 0b00): vfio-pci -> ioatdma
00:34:34.350 0000:65:00.0 (8086 0a54): vfio-pci -> nvme
00:34:34.350 0000:00:01.6 (8086 0b00): vfio-pci -> ioatdma
00:34:34.350 0000:00:01.7 (8086 0b00): vfio-pci -> ioatdma
00:34:34.610 0000:00:01.4 (8086 0b00): vfio-pci -> ioatdma
00:34:34.610 0000:00:01.5 (8086 0b00): vfio-pci -> ioatdma
00:34:34.610 0000:00:01.2 (8086 0b00): vfio-pci -> ioatdma
00:34:34.870 0000:00:01.3 (8086 0b00): vfio-pci -> ioatdma
00:34:34.870 0000:00:01.0 (8086 0b00): vfio-pci -> ioatdma
00:34:34.870 0000:00:01.1 (8086 0b00): vfio-pci -> ioatdma
00:34:40.158 Cleaning
00:34:40.158 Removing: /var/run/dpdk/spdk0/config
00:34:40.158 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0
00:34:40.158 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1
00:34:40.158 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2
00:34:40.158 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3
00:34:40.158 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0
00:34:40.158 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1
00:34:40.158 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2
00:34:40.158 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3
00:34:40.158 Removing: /var/run/dpdk/spdk0/fbarray_memzone
00:34:40.158 Removing: /var/run/dpdk/spdk0/hugepage_info
00:34:40.158 Removing: /dev/shm/nvmf_trace.0
00:34:40.158 Removing: /dev/shm/spdk_tgt_trace.pid822234
00:34:40.158 Removing: /var/run/dpdk/spdk0
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1005569
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1008588
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1013747
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1017793
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1024647
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1027651
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1034747
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1037229
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1044625
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1048297
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1055965
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1058424
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1064067
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1064384
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1064825
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1065307
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1065721
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1066410
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1067436
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1067840
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1069984
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1072041
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1074075
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1075915
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1077995
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1080003
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1082100
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1083982
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1084622
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1085301
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1087919
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1090321
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1092687
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1094029
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1095560
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1096190
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1096212
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1096294
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1096605
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1096703
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1098032
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1100022
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1101954
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1102869
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1103865
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1104186
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1104208
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1104239
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1105342
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1106089
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1106478
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1109153
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1111457
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1113749
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1115106
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1116861
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1117708
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1117750
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1122274
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1122390
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1122709
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1122747
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1123078
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1123632
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1124663
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1125940
00:34:40.158 Removing: /var/run/dpdk/spdk_pid1127179
00:34:40.158 Removing: /var/run/dpdk/spdk_pid817213
00:34:40.158 Removing: /var/run/dpdk/spdk_pid819996
00:34:40.158 Removing: /var/run/dpdk/spdk_pid822234
00:34:40.158 Removing: /var/run/dpdk/spdk_pid822744
00:34:40.158 Removing: /var/run/dpdk/spdk_pid823684
00:34:40.158 Removing: /var/run/dpdk/spdk_pid823977
00:34:40.158 Removing: /var/run/dpdk/spdk_pid824955
00:34:40.158 Removing: /var/run/dpdk/spdk_pid825186
00:34:40.158 Removing: /var/run/dpdk/spdk_pid825378
00:34:40.158 Removing: /var/run/dpdk/spdk_pid828612
00:34:40.158 Removing: /var/run/dpdk/spdk_pid830542
00:34:40.158 Removing: /var/run/dpdk/spdk_pid830890
00:34:40.158 Removing: /var/run/dpdk/spdk_pid831249
00:34:40.158 Removing: /var/run/dpdk/spdk_pid831623
00:34:40.158 Removing: /var/run/dpdk/spdk_pid831850
00:34:40.158 Removing: /var/run/dpdk/spdk_pid832034
00:34:40.158 Removing: /var/run/dpdk/spdk_pid832346
00:34:40.158 Removing: /var/run/dpdk/spdk_pid832688
00:34:40.158 Removing: /var/run/dpdk/spdk_pid833660
00:34:40.158 Removing: /var/run/dpdk/spdk_pid836737
00:34:40.158 Removing: /var/run/dpdk/spdk_pid836971
00:34:40.158 Removing: /var/run/dpdk/spdk_pid837327
00:34:40.158 Removing: /var/run/dpdk/spdk_pid837667
00:34:40.158 Removing: /var/run/dpdk/spdk_pid837690
00:34:40.158 Removing: /var/run/dpdk/spdk_pid837774
00:34:40.158 Removing: /var/run/dpdk/spdk_pid838074
00:34:40.158 Removing: /var/run/dpdk/spdk_pid838400
00:34:40.158 Removing: /var/run/dpdk/spdk_pid838642
00:34:40.158 Removing: /var/run/dpdk/spdk_pid838768
00:34:40.158 Removing: /var/run/dpdk/spdk_pid839075
00:34:40.158 Removing: /var/run/dpdk/spdk_pid839390
00:34:40.158 Removing: /var/run/dpdk/spdk_pid839716
00:34:40.158 Removing: /var/run/dpdk/spdk_pid839898
00:34:40.158 Removing: /var/run/dpdk/spdk_pid840085
00:34:40.158 Removing: /var/run/dpdk/spdk_pid840390
00:34:40.158 Removing: /var/run/dpdk/spdk_pid840707
00:34:40.158 Removing: /var/run/dpdk/spdk_pid840976
00:34:40.158 Removing: /var/run/dpdk/spdk_pid841085
00:34:40.158 Removing: /var/run/dpdk/spdk_pid841379
00:34:40.158 Removing: /var/run/dpdk/spdk_pid841701
00:34:40.158 Removing: /var/run/dpdk/spdk_pid842015
00:34:40.158 Removing: /var/run/dpdk/spdk_pid842189
00:34:40.158 Removing: /var/run/dpdk/spdk_pid842397
00:34:40.158 Removing: /var/run/dpdk/spdk_pid842699
00:34:40.158 Removing: /var/run/dpdk/spdk_pid843017
00:34:40.158 Removing: /var/run/dpdk/spdk_pid843338
00:34:40.158 Removing: /var/run/dpdk/spdk_pid843661
00:34:40.158 Removing: /var/run/dpdk/spdk_pid843992
00:34:40.158 Removing: /var/run/dpdk/spdk_pid844317
00:34:40.158 Removing: /var/run/dpdk/spdk_pid844640
00:34:40.158 Removing: /var/run/dpdk/spdk_pid844967
00:34:40.158 Removing: /var/run/dpdk/spdk_pid845295
00:34:40.158 Removing: /var/run/dpdk/spdk_pid845620
00:34:40.158 Removing: /var/run/dpdk/spdk_pid845685
00:34:40.158 Removing: /var/run/dpdk/spdk_pid846065
00:34:40.158 Removing: /var/run/dpdk/spdk_pid846491
00:34:40.158 Removing: /var/run/dpdk/spdk_pid846821
00:34:40.158 Removing: /var/run/dpdk/spdk_pid847039
00:34:40.158 Removing: /var/run/dpdk/spdk_pid851554
00:34:40.158 Removing: /var/run/dpdk/spdk_pid854025
00:34:40.158 Removing: /var/run/dpdk/spdk_pid856057
00:34:40.158 Removing: /var/run/dpdk/spdk_pid857270
00:34:40.158 Removing: /var/run/dpdk/spdk_pid858647
00:34:40.158 Removing: /var/run/dpdk/spdk_pid859052
00:34:40.158 Removing: /var/run/dpdk/spdk_pid859156
00:34:40.158 Removing: /var/run/dpdk/spdk_pid859184
00:34:40.158 Removing: /var/run/dpdk/spdk_pid863943
00:34:40.158 Removing: /var/run/dpdk/spdk_pid864527
00:34:40.158 Removing: /var/run/dpdk/spdk_pid865747
00:34:40.158 Removing: /var/run/dpdk/spdk_pid866058
00:34:40.158 Removing: /var/run/dpdk/spdk_pid874824
00:34:40.158 Removing: /var/run/dpdk/spdk_pid876560
00:34:40.158 Removing: /var/run/dpdk/spdk_pid877669
00:34:40.158 Removing: /var/run/dpdk/spdk_pid882194
00:34:40.158 Removing: /var/run/dpdk/spdk_pid884055
00:34:40.158 Removing: /var/run/dpdk/spdk_pid885285
00:34:40.158 Removing: /var/run/dpdk/spdk_pid890238
00:34:40.159 Removing: /var/run/dpdk/spdk_pid892818
00:34:40.159 Removing: /var/run/dpdk/spdk_pid893838
00:34:40.159 Removing: /var/run/dpdk/spdk_pid904325
00:34:40.159 Removing: /var/run/dpdk/spdk_pid906745
00:34:40.159 Removing: /var/run/dpdk/spdk_pid908091
00:34:40.159 Removing: /var/run/dpdk/spdk_pid920895
00:34:40.159 Removing: /var/run/dpdk/spdk_pid923103
00:34:40.159 Removing: /var/run/dpdk/spdk_pid924185
00:34:40.159 Removing: /var/run/dpdk/spdk_pid935505
00:34:40.159 Removing: /var/run/dpdk/spdk_pid939050
00:34:40.159 Removing: /var/run/dpdk/spdk_pid940069
00:34:40.159 Removing: /var/run/dpdk/spdk_pid952902
00:34:40.159 Removing: /var/run/dpdk/spdk_pid957349
00:34:40.159 Removing: /var/run/dpdk/spdk_pid959305
00:34:40.159 Removing: /var/run/dpdk/spdk_pid971127
00:34:40.159 Removing: /var/run/dpdk/spdk_pid973722
00:34:40.159 Removing: /var/run/dpdk/spdk_pid974916
00:34:40.159 Removing: /var/run/dpdk/spdk_pid987176
00:34:40.159 Removing: /var/run/dpdk/spdk_pid993030
00:34:40.159 Removing: /var/run/dpdk/spdk_pid994079
00:34:40.159 Removing: /var/run/dpdk/spdk_pid995309
00:34:40.159 Removing: /var/run/dpdk/spdk_pid999282
00:34:40.159 Clean
00:34:40.159 13:43:20 -- common/autotest_common.sh@1451 -- # return 0
00:34:40.159 13:43:20 -- spdk/autotest.sh@388 -- # timing_exit post_cleanup
00:34:40.159 13:43:20 -- common/autotest_common.sh@730 -- # xtrace_disable
00:34:40.159 13:43:20 -- common/autotest_common.sh@10 -- # set +x
00:34:40.159 13:43:20 -- spdk/autotest.sh@390 -- # timing_exit autotest
00:34:40.159 13:43:20 -- common/autotest_common.sh@730 -- # xtrace_disable
00:34:40.159 13:43:20 -- common/autotest_common.sh@10 -- # set +x
00:34:40.419 13:43:20 -- spdk/autotest.sh@391 -- # chmod a+r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:34:40.419 13:43:20 -- spdk/autotest.sh@393 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]]
00:34:40.419 13:43:20 -- spdk/autotest.sh@393 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log
00:34:40.420 13:43:20 -- spdk/autotest.sh@395 -- # hash lcov
00:34:40.420 13:43:20 -- spdk/autotest.sh@395 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]]
00:34:40.420 13:43:20 -- spdk/autotest.sh@397 -- # hostname
00:34:40.420 13:43:20 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-CYP-06 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info
00:34:40.420 geninfo: WARNING: invalid characters removed from testname!
00:35:02.378 13:43:42 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:35:05.679 13:43:45 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:35:08.220 13:43:48 -- spdk/autotest.sh@400 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:35:10.757 13:43:50 -- spdk/autotest.sh@401 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:35:12.673 13:43:53 -- spdk/autotest.sh@402 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:35:14.648 13:43:55 -- spdk/autotest.sh@403 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info
00:35:16.557 13:43:57 -- spdk/autotest.sh@404 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR
00:35:16.557 13:43:57 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:35:16.557 13:43:57 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:35:16.557 13:43:57 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:35:16.557 13:43:57 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:35:16.557 13:43:57 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:35:16.557 13:43:57 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:35:16.557 13:43:57 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:35:16.557 13:43:57 -- paths/export.sh@5 -- $ export PATH
00:35:16.557 13:43:57 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:35:16.557 13:43:57 -- common/autobuild_common.sh@446 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:35:16.557 13:43:57 -- common/autobuild_common.sh@447 -- $ date +%s
00:35:16.557 13:43:57 -- common/autobuild_common.sh@447 -- $ mktemp -dt spdk_1721907837.XXXXXX
00:35:16.557 13:43:57 -- common/autobuild_common.sh@447 -- $ SPDK_WORKSPACE=/tmp/spdk_1721907837.hg2Byg
00:35:16.557 13:43:57 -- common/autobuild_common.sh@449 -- $ [[ -n '' ]]
00:35:16.557 13:43:57 -- common/autobuild_common.sh@453 -- $ '[' -n '' ']'
00:35:16.557 13:43:57 -- common/autobuild_common.sh@456 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:35:16.557 13:43:57 -- common/autobuild_common.sh@460 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:35:16.558 13:43:57 -- common/autobuild_common.sh@462 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:35:16.558 13:43:57 -- common/autobuild_common.sh@463 -- $ get_config_params
00:35:16.558 13:43:57 -- common/autotest_common.sh@398 -- $ xtrace_disable
00:35:16.558 13:43:57 -- common/autotest_common.sh@10 -- $ set +x
00:35:16.558 13:43:57 -- common/autobuild_common.sh@463 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:35:16.558 13:43:57 -- common/autobuild_common.sh@465 -- $ start_monitor_resources
00:35:16.558 13:43:57 -- pm/common@17 -- $ local monitor
00:35:16.558 13:43:57 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:16.558 13:43:57 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:16.558 13:43:57 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:16.558 13:43:57 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:16.558 13:43:57 -- pm/common@21 -- $ date +%s
00:35:16.558 13:43:57 -- pm/common@25 -- $ sleep 1
00:35:16.558 13:43:57 -- pm/common@21 -- $ date +%s
00:35:16.558 13:43:57 -- pm/common@21 -- $ date +%s
00:35:16.558 13:43:57 -- pm/common@21 -- $ date +%s
00:35:16.558 13:43:57 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721907837
00:35:16.558 13:43:57 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721907837
00:35:16.558 13:43:57 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721907837
00:35:16.558 13:43:57 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1721907837
00:35:16.558 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721907837_collect-vmstat.pm.log
00:35:16.558 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721907837_collect-cpu-load.pm.log
00:35:16.558 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721907837_collect-cpu-temp.pm.log
00:35:16.558 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1721907837_collect-bmc-pm.bmc.pm.log
00:35:17.498 13:43:58 -- common/autobuild_common.sh@466 -- $ trap stop_monitor_resources EXIT
00:35:17.498 13:43:58 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j128
00:35:17.498 13:43:58 -- spdk/autopackage.sh@11 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:35:17.498 13:43:58 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]]
00:35:17.498 13:43:58 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]]
00:35:17.498 13:43:58 -- spdk/autopackage.sh@19 -- $ timing_finish
00:35:17.498 13:43:58 -- common/autotest_common.sh@736 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl
00:35:17.498 13:43:58 -- common/autotest_common.sh@737 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']'
00:35:17.498 13:43:58 -- common/autotest_common.sh@739 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt
00:35:17.759 13:43:58 -- spdk/autopackage.sh@20 -- $ exit 0
00:35:17.759 13:43:58 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources
00:35:17.759 13:43:58 -- pm/common@29 -- $ signal_monitor_resources TERM
00:35:17.759 13:43:58 -- pm/common@40 -- $ local monitor pid pids signal=TERM
00:35:17.759 13:43:58 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:17.759 13:43:58 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]]
00:35:17.759 13:43:58 -- pm/common@44 -- $ pid=1140298
00:35:17.759 13:43:58 -- pm/common@50 -- $ kill -TERM 1140298
00:35:17.759 13:43:58 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:17.759 13:43:58 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]]
00:35:17.759 13:43:58 -- pm/common@44 -- $ pid=1140299
00:35:17.759 13:43:58 -- pm/common@50 -- $ kill -TERM 1140299
00:35:17.759 13:43:58 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:17.759 13:43:58 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]]
00:35:17.759 13:43:58 -- pm/common@44 -- $ pid=1140301
00:35:17.759 13:43:58 -- pm/common@50 -- $ kill -TERM 1140301
00:35:17.759 13:43:58 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:35:17.759 13:43:58 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]]
00:35:17.759 13:43:58 -- pm/common@44 -- $ pid=1140324
00:35:17.759 13:43:58 -- pm/common@50 -- $ sudo -E kill -TERM 1140324
00:35:17.759 + [[ -n 687445 ]]
00:35:17.759 + sudo kill 687445
00:35:17.770 [Pipeline] }
00:35:17.788 [Pipeline] // stage
00:35:17.794 [Pipeline] }
00:35:17.812 [Pipeline] // timeout
00:35:17.818 [Pipeline] }
00:35:17.830 [Pipeline] // catchError
00:35:17.836 [Pipeline] }
00:35:17.852 [Pipeline] // wrap
00:35:17.858 [Pipeline] }
00:35:17.873 [Pipeline] // catchError
00:35:17.882 [Pipeline] stage
00:35:17.885 [Pipeline] { (Epilogue)
00:35:17.900 [Pipeline] catchError
00:35:17.902 [Pipeline] {
00:35:17.918 [Pipeline] echo
00:35:17.920 Cleanup processes
00:35:17.926 [Pipeline] sh
00:35:18.216 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:35:18.216 1140415 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache
00:35:18.216 1140800 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:35:18.230 [Pipeline] sh
00:35:18.516 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:35:18.516 ++ grep -v 'sudo pgrep'
00:35:18.516 ++ awk '{print $1}'
00:35:18.516 + sudo kill -9 1140415
00:35:18.528 [Pipeline] sh
00:35:18.814 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh
00:35:31.050 [Pipeline] sh
00:35:31.338 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh
00:35:31.338 Artifacts sizes are good
00:35:31.355 [Pipeline] archiveArtifacts
00:35:31.362 Archiving artifacts
00:35:31.520 [Pipeline] sh
00:35:31.858 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest
00:35:31.875 [Pipeline] cleanWs
00:35:31.885 [WS-CLEANUP] Deleting project workspace...
00:35:31.886 [WS-CLEANUP] Deferred wipeout is used...
00:35:31.893 [WS-CLEANUP] done
00:35:31.895 [Pipeline] }
00:35:31.915 [Pipeline] // catchError
00:35:31.927 [Pipeline] sh
00:35:32.210 + logger -p user.info -t JENKINS-CI
00:35:32.221 [Pipeline] }
00:35:32.239 [Pipeline] // stage
00:35:32.244 [Pipeline] }
00:35:32.260 [Pipeline] // node
00:35:32.266 [Pipeline] End of Pipeline
00:35:32.296 Finished: SUCCESS